Composite Models vs Reusable Datasets – Ep. 460
Should you use composite models or reusable datasets to share semantic model logic across teams? Mike and Tommy compare the approaches, trade-offs, and best-fit scenarios. Plus, a listener shoutout for Translytical Task Flows.
Beat from the Street
Listener feedback highlights Translytical Task Flows as a standout Fabric feature—combining transactional and analytical workloads in a streamlined flow.
Main Discussion: Composite Models vs. Reusable Datasets
The Problem: Model Reuse
Organizations often need:
- A central “golden” semantic model with core business logic
- Department-specific extensions with additional measures or tables
- A way to share definitions without copy-pasting across models
Two approaches exist: composite models and reusable datasets (live connections with local models).
Composite Models
- Connect to an existing published semantic model via DirectQuery
- Add local import tables, measures, or relationships
- The base model stays centralized; extensions are local
- Pros: Flexible, powerful, enables departmental customization
- Cons: Performance considerations (DirectQuery overhead), complexity in managing dependencies
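As a sketch of what a composite-model extension looks like in practice, a department might add a local import table next to the central model and write a measure that spans both. The names here are hypothetical, not from the episode: [Total Sales] is assumed to exist in the remote base model, and 'Budget' is an assumed local import table.

```dax
-- Hypothetical local measure in a composite model.
-- [Total Sales] comes from the remote (DirectQuery) base semantic model;
-- 'Budget' is a local import table added alongside it in the report's model.
Sales vs Budget =
[Total Sales] - SUM ( Budget[Amount] )
```

Because the measure crosses the boundary between the remote model and the local table, filters must travel between the two sources, which is where the DirectQuery overhead noted above comes from.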
Reusable Datasets (Live Connection + Local Model)
- Connect to a shared semantic model
- Add local measures on top (without adding tables)
- Simpler than full composite models
- Pros: Lightweight, fast to set up, maintains single source of truth
- Cons: Limited to measures only (no local tables), less flexibility
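For illustration, a report-local measure layered on a live connection might look like the following. This is a sketch under assumptions: the shared model is presumed to expose a [Total Sales] measure and a marked 'Date' table, neither of which is named in the episode.

```dax
-- Hypothetical report-level measure added on top of a shared semantic model
-- via live connection; no new tables are introduced, only a measure.
Sales YoY % =
VAR CurrentSales = [Total Sales]
VAR PriorSales =
    CALCULATE ( [Total Sales], DATEADD ( 'Date'[Date], -1, YEAR ) )
RETURN
    DIVIDE ( CurrentSales - PriorSales, PriorSales )
```

Since the measure lives in the report rather than the shared model, the single source of truth stays intact, and the measure evaluates inside the shared model's engine without cross-source overhead.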
When to Use Which
- Composite models when departments need their own data alongside the central model
- Reusable datasets when you just need additional measures on top of a shared model
- Neither when the central model already has everything teams need
- Consider model chaining patterns where multiple models build on each other
Governance Implications
- Both approaches require clear ownership of the base model
- Changes to the base model can break downstream composite models
- Documentation and communication are critical
- Testing downstream impacts before publishing base model changes
Looking Forward
Model reuse patterns will become more important as organizations standardize on fewer, higher-quality semantic models. The combination of composite models, user data functions (UDFs), and Git-based workflows makes it possible to build a modular, maintainable semantic model architecture.
Episode Transcript
Full verbatim transcript:
0:04 Good morning and welcome back to the
Explicit Measures podcast with Tommy and Mike. Good morning everyone. Good morning Mike. How you doing? I'm doing great. I'm over at the Microsoft Fabric Conference as we speak. I believe this is our Thursday episode, and this is, I think, the last day, second to last day, somewhere around there, of the Microsoft Fabric Conference. I think it goes Tuesday through Thursday. So conference. Yes, exactly. So I'm probably sleeping now. No, I'm up now, cuz of the 8-hour shift. So I'll be up. Good morning. I'm probably talking
To you right now somewhere at the conference. So hello and welcome. I hope you're enjoying the conference, for those of you who were able to attend the Microsoft Fabric Conference. I will say though, Tommy, the conferences have been getting more fun over these last couple years going to Las Vegas. Now, they're going to be doing one in Vienna, and the next one in 2026 is over in Atlanta, Georgia, I think, is the next announcement. Okay. So, that'll be really fun. It's finally on our side of the country. It's just a flight south to Atlanta. Easy,
Man. Will you make it? I'm going to try to. I'm trying to go to San Francisco, I think. Yeah. Atlanta is where we met. So, it's very sacred ground. Atlanta is where we met. Was that one of the Microsoft Data Insights conferences? That was the Business Applications Summit. Was it Business Applications back then? Yes, it was, because, dude, I don't know if you know this, but I submitted a session and
I had, outside of the keynote, the highest number of people who said they were going to join. Yeah, it was in the 3,000s, and they're like, you're having a theater, so we may have to move, because this is not going to work. I was like, okay, cool. And then I had a fever. So, literally, I was in the hallway just trying to get some energy, and then Chuck walks by with you and Seth and goes, "Hey, this is the guy I want you to meet." I'm like, "How you doing?" So,
You guys laughed and kept walking. Yeah, I remember that, man. Amazing. I don't remember that. I remember the building though, a little bit. I think it was next to the Mercedes-Benz Stadium at the time. Giant conference center. So, huge conference center. And did we all stay in the same Airbnb at the time, or was that Oh no, I stayed in my own hotel room. Yeah. So, cuz around that time in the conferences, I started with other MVPs, when we went to conferences, we would like
Connect up and say, hey, let's go get our own Airbnb and split the cost. It was much cheaper than getting a hotel like that, and then you have a full kitchen. You can get your own breakfast. It's a little bit more flexible, I guess. So, I remember going there, and we had an Airbnb right next to the conference center, and we would walk over every morning, like a block and a half, to get to the conference center. I do remember that part. Dude, the room they sent me to felt like it was a block and a half, because that convention center was massive. I had a walk, which was a very long walk. I'm like, well, I guess so. I distinctly remember that
Conference because I believe we had a visualizing session with all the top tips and tricks from PowerBI.tips, one of those things, and I just happened to be scheduled at the exact same time as Power Hour, which was horrible, because everyone wanted to go to Power Hour and no one wanted to come to my session. So, I was super energetic for the five people that showed up to my session, because everyone else, it was at the end of the day. It was the last conference day. Everyone was ramping down, and then they had Power Hour, and everyone wanted to cram in for Power Hour. So, that was
Just quite funny. That's hilarious. Anyways, let's talk about our topic today, and then we'll go into a Beat from the Street here, some news articles, and then we'll go back into the main topic today. Our topic is going to be around composite models and reusable data sets. Are we getting confused and sending mixed messages depending on what we're saying here? So, this is going to be interesting. A good topic here, I think. Composite models exist, but maybe we should be building reusable data sets instead. What's the best place to go? What's the best thing to use and build
This stuff with? So, I want to explore that real quick. Before we get into the main topic though, I have a little bit of a, call it a Beat from the Street. I've been doing a lot of work around Translytical. Have you played with Translytical at all, Tommy? So, dabbling. I know you've got a story. You were just showing me, you're going back to, forget the cloud, we're going local. So, what are you doing, Mike? Yeah, I'm just playing with Translytical. So, over the last couple weeks, I've been doing a couple of webinars around Translytical. I've been building examples around it. I had a recent client ask me a question, said, "Hey, we
Have an inventory issue." And the issue really was, "We have inventory at multiple locations, different data warehouses." And so they needed the ability for all the managers of the warehouses to be able to see all inventory across all locations. And then when your parts are low, again, you're building products, parts for a product that you need, you should be able to at least request it from another location that has a very high inventory, something along those lines, right? Weeks on hand becomes very important, and so, like, if you're running low or short or
Out of something, you can have someone ship it over to you, and then you can have those parts rebalancing, basically, your inventory levels. I built a whole Translytical workload around that. That'll be one of the demos that we do on Tuesday at FabCon, which is in the past now, because you'll be seeing this video in the future. But that's one of the demos that we're going to be doing. And so, in addition to that, I'm understanding a lot more around Translytical task flows. I can see why this was such a big deal for Microsoft, because there's a lot of two-way communication that needs to happen between multiple different services. For
Example, the report needs to send data. The user data function needs to receive the data. The user data function needs to do multiple things. The user data function needs to be like, I'm going to execute this SQL query. I'm going to send this data. I'm going to do XYZ things. But the user data function needs to have a completion signal, like, hey, I'm done. I've completed, I've been able to run successfully or not successfully. And that data is then funneled back into the report, and then when the report receives the message
Back from the user data function, it refreshes itself. There's a lot of little micro-interactions that are happening between services that just have to work. It just has to be reliable, and they dialed it in, and I'm actually quite impressed. There's a lot of experiences you can build here. I had two clients, almost the same week, ask me, hey, we need to add annotations on top of our data. I was like, great, that is exactly what you want to do. Here's some data. Select the piece of data and then write down or
Write into that data directly. But a lot of people, I think, are really trying to unpack and figure out what this looks like. Anyways, very interesting. Very excited to see how this is going to be used. So, outside of using it on a Raspberry Pi, which is, I think, just showing the example, one of those Internet of Things things, it can be any host or platform. Think about IoT. This is a function, right? The function can talk like it's Azure. It can talk to anything. So I'm working on another demo or example, and maybe we'll do it here on the podcast, or actually have
Like a quick tips to show people how it's actually built. But I also made a Translytical workflow where you could go into the report, put some text into a text slicer, and when you hit send, it will send it to the function. The function will then execute, send the data to an Azure IoT Hub, which will then send it directly to a single device. The device will then display the message and then message back that it's complete. So now the user data function can signal that you've sent a message and it's been received. How cool
Is that, man? Dude, and I think too, like, webhooks, all those things, where it's showcasing the automation side of this, too. Yes. Obviously, you're doing the manual side, but this is a lot. It doesn't have to be a Raspberry Pi. It doesn't have to be a physical device. It could literally be just some trigger that sets it off. Anything you can think of. And as long as it has an API to it, you can do it. If This Then That could be leveraged here. We were joking around, building this demo, like, someone could actually make a report that would open their garage door if they wanted. They could integrate with that if
They really wanted to. Or turn on their lights, or turn off their lights. This could all be integrated into a Power BI report at this point, which is mind-blowing. The thing I think I like the most about this experience, though, is, okay, fine, Translytical task flows, don't like the name, buy it or not, whatever it is. The neat part about this is how well it integrates with every single other part of the product, right? It's the, click a button, send some data, and then the report just does it. It just works. Like, I'm not
Wiring a bunch of extra things in. It's a little bit tricky to set up, but it's not difficult. You can gather what it's trying to do. And so, very exciting. I'm really excited to see where people are going to take this one. There's already a Translytical task flow gallery that's out there. There's already a couple submissions in it currently. It's actually really pretty neat. Nice, dude. And I know, we talked about those samples, like, I don't know where this is going. I can see it's code, but I don't know where to put it, and, like, where the
Output is going to go. So it's good to see, though, too, with these solutions, that we're having a little more context. I don't know, you were never huge into Power Apps, but the community there, I think run by Microsoft, had probably the best GitHub repository of solutions. Yeah. And each one had, obviously, the solution code, and the install instructions, what it does, an image of it. So it was really nice to say, like, oh, okay, so this would do this. Is that going to meet my needs? Okay, here's how I need to install, here's the
Configuration. Sure. So rather than just pure code or zip files that you're looking at. Agreed, that's actually really nice, to have images or some examples of how that works. One thing I'll just note here around Power Apps. I really like Power Apps, Tommy. You've done a lot of Power Apps in the past, and it's really nice to build apps. I think Microsoft had the right idea. A low-code or no-code solution to building applications is the right thing. But I don't think Microsoft was anticipating this whole large language model vibe coding experience to just take
Over. And honestly, Tommy, I feel like every week I'm seeing massive improvements, with better code, better prompting. And I heard something that I want your reaction to. Okay. All right. I'm listening. I heard someone talking about their company and how they do almost all of their code using agents to help them code. Okay. Okay. I don't like where this is going. Okay, so they said, in their experience, they found that doing vibe coding is like writing poetry
And instead of trying to use poetry to evoke an emotion or a feeling or something else, you're using the words to evoke an application, a feature, a design. So, it's like, poetry isn't a science. It's a bit of art and science mixed together. Okay. Right. You can have poetry written, and in different people it evokes a different response. And I feel like sometimes that's the same
Way that large language models react when I talk to them. Like, if I'm not descriptive enough, if I'm not specific enough, if I'm not clear enough, the same statement might resolve to multiple different results from the large language model. So, I thought to myself, when he said that, I was like, okay, interesting. Yeah, poetry. Okay, you're being just a little bit too poetic with the words here. But as I really sat and mulled over what he was describing, yeah, it really is. It's like writing poetry. You're trying to write the right
Kinds of words to communicate correctly with the large language model and get the desired output or results that you want from the large language model. Isn't that, I thought that was really interesting. Freaking thousand percent. So there's a few local projects I was working on, but the context, like, it cannot be understated how important feeding the context is. Like, honestly, as much as you want to invest in the cool tooling, it's looking into how you actually feed that information to those AI models. So, for example, I have
An extension that, on GitHub, for any repo, will basically just convert the entire codebase to a text file with all the files and everything. Sure. That's what I feed it, things like, hey, take a look at this. This is an example. So that is such an important part, and again, you look at this across the board, this goes back to our conversation around AI prep and Copilot. With AI prep, you only have, you have the data, but the data is not the training.
14:15 What ? That’s what they’re trying to evaluate. Yes. You don’t have that back end of like true false thing like training data data or what’s right what’s wrong. And you only have a text box. That’s just the only context you can give it and we already know with sample data with like basic data even like whatever the type of generation is it needs a lot I would agree I’m getting to the point right now we in our company I run a consulting firm that does app development and consulting around
Power BI. The Power BI things, not so much, but when we're talking about app development or function development or visual object development, things that are very well-known, I'm lighting up Copilot for all my team members. GitHub Copilot, I'm just buying it. Everyone's going to get their own version of GitHub Copilot. It's going to be part of our team. And right now, internally, we're all trying to learn what's the right way to use Copilot. So the whole team is trying to figure out what is useful. And what we're finding is giving
Copilot very small tasks, lots of little ones, and working on specific problems or issues, is extremely helpful in helping everyone be able to produce code. We're now using GitHub Copilot code reviews. So, every time we do a check-in or a pull request, when we go from the branch that you're developing on to dev, we're always using GitHub Copilot to do a review. So we've changed how we handle our review policy on code. Branching to dev goes through GitHub Copilot, and it catches a
Lot of things, like, hey, this logic is redundant. You have a double negative here. Are you sure you want that? It does a lot of finding of little things, and then it does a really good job commenting on them, so you can go back in, make another commit, and change those things and fix them. But then when we get from dev to test, that's where we can start involving people, right? That's where we start saying, okay, let's get the senior engineer or developer to step in and do the review. And then when we go from test to prod, we're having another review there to make sure that everything's still good and we
Can still deploy correctly. So I really do feel like this is actually permeating our workloads, and it's changing how we build things. On the other side of things, I'm also trying to look at AI, where can we apply it to sales and marketing? How do I use it to help communicate what we're building, and how to communicate this out to our audience and team members? It's all new. Have you heard of Claude Code? Claude Code. Yes. Yeah. So, there's a ton of, like, this whole idea around agents too. So, Claude Code, Replit, Lovable,
Codex, things like that. So, yeah. There's, again, I can't keep up. Every other day there's another one. There's another agent showing up in a code solution. Like, now you have Grok fast code, that's another one, but now you can get that in VS Code. All these different things are all over the place. And what I think, Tommy, is I think we're going to get very specialized. I think these agents are going to get efficient, lower cost, and they're going to be very focused on a specific thing. Someone was telling me what data centers are going to require in order to run all this AI stuff, because of the GPUs and
The heavy amount of consumption of GPU usage. Someone was telling me that data centers are going to need to be able to use, I think they said, 10 to 50 megawatts, no, gigawatts, 10 to 50 gigawatts of power, to run these things. I think we're going to run out of power pretty soon. For context, a gigawatt runs like a tiny city, right? So, 50 gigawatts of power running through these data centers is enormous. We're going to have to figure out cheap sources of energy that don't involve burning fossil fuels all the
18:07 Time. And it’s also the amount of water consumes too. So yeah, there’s there’s a ton of that. So Well, I don’t understand that. Why do you need wa why is water being consumed to cool? Yeah, but I understand. But like where does it go? Like are you literally turning it to steam? Is that what we’re doing? We’re steaming No. So it’s Mike. Think of a normal PC like a powered PC. It’s either going to be fanned, but the best way is to have a coolant water cooled. Sure. Yeah. Do you remember when we were at Ignite last year? Do you remember that? Remember they just they were just they
Had an entire display of a GPU rack. It was art, in the art section. Exactly. I was writing poetry to it. Oh, ChatGPT. Oh, ChatGPT, thou art. Yeah. But that's the whole idea, too. It's got to run somehow. So, there's got to be a better way. Someone's got to figure out a better way of moving the heat away from the processors, out of the data center, without having to use a ton of water. That just seems ridiculous, Tommy. Like,
Why do we need to consume water to cool our things? And why is it consumption? It feels like that should be a very simple closed loop, something or another. I don't quite get it. Not sure if I understand it quite yet, but I guess that's why people are trying to drop data centers at the bottom of the ocean. I think Microsoft did this, where they dropped a whole data center at the bottom of the ocean, cuz it's like infinity cold at the bottom of the ocean. You can't warm it up. It's just too big, I guess. Maybe. I don't know. I cannot wait to see that blog article introducing the Atlantic data center. It's not OneLake anymore. It's
Right, bottom of the lake data. Bottom of the lake. Oh gosh. Well, I want to see. Yeah. But what's that approach that you actually do? Is that different? No. Yeah. It's just colder data. We put it in cold storage. All the funny terms. Anyways, okay, enough of our bantering about the Beat from the Street. Super fun. I definitely would recommend people go check out Translytical Task Flows. Very neat. And it only costs you CUs. You don't have to go buy another license. There's
Not another app. You can do writeback. A lot of things that you wish you could have done are now being directly handled through Translytical, which is pretty dang sweet. Okay, that being said, let's get on to our main topic. Tommy, want to do the honors here today and give us the main topic? So, our main topic is mailbag. Do we use composite models or reusable data sets? Are we sending mixed messages? And here is the mailbag. Whoa, I think I'm triggered after listening to both your composite model episode and your managing multiple data
Set episode. Isn't a fair bit of the question fairly similar? However, much of the discussion seems different between episodes. In one sense, the gold standard seems to be to be thorough with data modeling and data set development, to build a data set, a semantic model, that can be used in multiple thin reports as needed. But the use of composite models seems to be somewhat discouraged, even though this is another way to reuse
Key semantic models to answer business questions without redoing the exact same transformations. Mhm. It was alluded to but not explored that Fabric has better ways of doing, he put a question mark on that, it was alluded to but not explored that Fabric has better ways of doing this, correct? Like a data warehouse or lakehouse. Can you expand on that? Thanks for the podcast. Even though I consider myself skilled at Power BI, I'm constantly learning from you guys. Oh, that's how you get on. That's definitely a lie.
21:58 That’s how you get on the podcast. So, a little sweet nothings. Exactly. Sweet nothings. You’re writing poetry to my ears as we slap poetry. So, so this is a this is a great topic, Tommy. I think this is actually very relevant. , and I I would agree, Tommy. I think the I think when composite models came out every there was a lot of like fanfare with what a composite model was. I think we should probably define what we mean by composite models and what we mean by golden data sets or or data sets that help define things in organizations. So
Tommy, want to take a swing at defining composite models? Yeah. So, composite models, I cannot tell you the year that it came out, but rather than doing a live connection to your semantic model, which has been available in Power BI, and obviously we know the limitations, you can't really build anything off of it, and the data that's in there is the data that you have, composite models allow us to use a mixed bag: the live connection to a semantic model, pick and choose actual tables from that semantic model, with other sources that
Are usually imported. So I can have a single Power BI report that is coming from my gold semantic model while also using an Excel file that is connected to SharePoint. That SharePoint file is not part of the original semantic model, but in this new, what we call composite model, there's a mix of, more or less, DirectQuery, or DirectQuery over Analysis Services, and import data. Yep. Not bad. Yeah. So, I think that's a great summary here.
I believe, again, I'm looking at ChatGPT, and people are trying to write about it and trying to pull things. So, again, your mileage may vary on this one. This may not be exactly right, but you don't need to have a PhD for this now. Come on. Well, there's a little bit of nuance, I think, to this one. So, Microsoft released, and Tommy, you're going to be shocked when I say this, Microsoft released composite models in 2018. What? I know. I know. Okay, let me caveat this. Composite models were released in 2018, meaning you could have mixed data sources, meaning it didn't
Have to only be a single source of information, right? I could go from SQL Server to another system, right? That was when the composite model was released. However, DirectQuery over Analysis Services, which is part of the composite model family, wasn't really released until 2020. So you couldn't have a semantic model of models until 2020. Oh, until 2020, got it. And so, Tommy, you're in the same boat as I am. We're both thinking, like,
Oh, a composite model is the DirectQuery to other Power BI semantic models. So DirectQuery to Analysis Services wasn't added until way later. So that's where I'm going. So it was originally released in 2018, and it was updated again in 2020 with really what we're talking about here, which is, hey, if I have dimensions or measures or calculations, I can put them in a single model and then reuse that model again in other places. Okay, so that's the composite model side. I want to unpack the other side of this
Question, which is reusable data sets. I think we should define those, and since you did the composite model, Tommy, I'm going to take the reusable data set, and then I also want your opinion, tell me what you think it means to you as well. So, in a reusable data set, we want to have a single data set serving as many thin reports as we possibly can. However, having one monolithic data model, we find, becomes cumbersome, difficult to update and manage. And so I want to describe what I mean by a reusable data set. Yes,
You want to have, like, a sales domain, right? There's sales information that the sales team needs. You're going to build in tables. You're going to have them interact with reports. There's a domain of information. You will probably have other domains of information, like operational data. Well, what we're talking about here is, you shouldn't put the operational data in with the sales data domain unless you really need it on a report where it joins both topics. Okay? So when I talk about reusable semantic models, I'm not saying, hey, your company should only have one or two
Models. That's it. You're done. Move on. Right? The larger your organization becomes and the larger the models become in size, you're going to run into limits; you can't keep all the data in memory all the time. So you're going to need to build domain-specific models. And again, this is also another thing around governance and administration. As your team grows, if it's one person, it's much easier, because you have one person who understands the model, they built the measures, they can make it happen, right? But Tommy, as you and I know, whenever we start working together on a project, Tommy's doing a
Little bit of something somewhere, I'm building something in a different spot. We've got to coordinate and talk and connect. So, when we try to build things together, there's this idea of, we need to be able to break apart the workload and let Tommy work on the operational reporting and Mike work on the sales reporting. And so this is where the domains, I think, make a lot of sense. Domain-specific reporting. So that's what I mean by reusable data sets, right? This is something where you're going to give people a report. The expectation is
They're going to take that report, maybe copy it, make new changes to it, adjust it, or you're going to give them a data set and say, here's a semantic model, go build what you want. Go build your flat table in Excel. Go build paginated reports. Go export the things you want. So, you're building self-service into those reusable data sets. That means you're adding descriptions. You have lots of clean, clear measures. You're organizing the information in an easy-to-understand way. You're communicating, these are dimensional things, things I'm going to cut the data by, and these are factual things, things I'm going to count, aggregate, or sum,
Or do some math to. There's all kinds of patterns in how you do this. But, if you've been around the data space long enough, you start recognizing some of those patterns. Let me just pause right there. Tommy, is there anything you would want to add to what I'm giving around a reusable data set definition? So, with the reusable data sets, just from the clarity point of view, and I think for a lot of people too, the idea has always been the gold model, thin reports idea, but obviously we know there are those
Limitations, and man, as a consultant, you see those limitations. Like, hey, can you make a quick fix here, and you look, and it's 175 tables in one model. Oh, that's... And you're like, if I adjust anything, I don't know what that's going to impact. I'm very, yeah, I'm very worried. There's a bidirectional relationship here, no wonder something's slow, and you're like, if I fix that, that's going to break everything. Oh, my friend. And they want to be just like, can you just fix that DAX real quick? I'm like, the dependencies. Yeah. So, I find when you have a lot of tables
Like that, it's a lot of people coming from a transactional world trying to build a transactional database inside Power BI with semantic models. And transactional databases just have a lot more tables. It's just going to be a lot more. Yeah, it is. That's how they build it. It's easier for the app to read and use, but it's not easy for us to do reporting on it. We need aggregations of this stuff. This is not good. Oh, the sub-product table, all related. Yeah. So anyways, again, as the business expands, the questions get, usually, especially in
29:38 The initial adoption. The questions and what’s possible just expand, and you can’t just put everything in a nice box, so to speak, of here’s our digital marketing semantic model and that’s everything they’re ever going to need. So being able to connect the dots across the business is huge. So there’s a lot of use cases here, but I’m going to pose something to you, Mike, regardless of which, because they both have their limitations. And I don’t want to just start on the limitations or the gotchas.
30:11 But I think it’s just really important to know, especially with composite models, what you’re getting yourself into. And again, I hate starting with a buyer beware, so to speak. However, yes and no, Tommy, I agree with you to some degree. But I think the one big negative that I see for composite models, and the reason why I don’t use them very much or why we heavily caution against them, is when we were sold the initial idea of direct
30:45 Query over Analysis Services, which is part of the composite model. So a model of models, right? I have one main Analysis Services model and I’m going to go grab other models across the board. The issue is that when the models talk to each other, when model A talks to model B and says, hey, give me some data, any filter context that you’re providing is not able to be handled inside one compute engine, because you’re basically running two separate computers and you’re having to communicate between them. That’s my mental model. It may not actually be doing that, but
31:16 Like in my mental model, that’s how I’m thinking about it, right? There are two separate computers, and everything goes over the wire, through the network, through the system, right? So say you select five product numbers in model A and you need those five product numbers to apply a filter effect on a table in a different model. And this is where Microsoft introduces this idea of islands, right? A single model is a single island. The second model that you’re doing the composite model with is a different island. So, as long as you stay in the context of the
31:49 Island, your queries run really fast, because it’s all inside one machine, one memory, storage, and compute item. As soon as you jump between islands, it’s like doing a shipping container, right? Everything’s fast when you’re in the country, right? You’re shipping things around. It’s all working fast. And as soon as I want to send product from one country to another, from Europe to America, I have to put it on a boat. I have to take all the items, put together a list of all the items, put it on a boat, and then
32:21 Transfer it over to the other island. And so because of that, there’s this slowdown effect. Now, if you had, say, five or six items, it’s not a big deal. But imagine, Tommy, you selected everything except this item. Well, now you’ve got this huge list of really long items. Like, I’ve got a massive list of text strings that is getting sent to the other model. And we know the models don’t really love filtering on text strings. It’s not really super efficient. So we see in situations where the islands are trying to pass lots of data between
32:56 Them, we get very big slowdowns, and so for me the composite model story falls apart there. Right? If you have a semantic model and you want to add an Excel document, or if you have a semantic model and you’re going to do joins on something that is low granularity, really low cardinality, meaning there are very few distinct items, right? Join on region, join on country, join on something that’s a bit more aggregated in form, then you’re probably
33:27 Fine. Composite models will probably work, no problem. You might find good, reasonable use cases for this. Kevin Arnold did a really good article and has been talking about composite models for a number of years, and he has some really good points around that. So I said a lot of things, Tommy. That’s my big downside to composite models. What do you think? Is that your impression too? Do you have anything else you want to add? Yeah, and I think where this really comes into play, and I’ll always go back to managed self-service, right, is usually when the question arises of how we are going to tackle this as an organization, because it also becomes
34:03 Not just a feature question at this point, it’s a governance question. So even if, let’s say, there were no bugs or limitations, and yeah, they actually have worked on the filter context flowing through different tables; initially that did not work. But if we roll out managed self-service, which a lot of organizations have, well, we’ll see what happens with Fabric now, but to me it really has been the shift for a lot of organizations, where we cannot survive on just single BI developers alone, but
34:38 We want to keep our data secure. So, you and I have talked in spades about MSS, or managed self-service. This is usually when this comes up. It’s like, okay, central team, you built these great marketing semantic models that I can connect to in the semantic model workspace, but I need to actually see those email sends too. So that’s a great use case where composite models come into play. I don’t want to say only inside managed self-service, but really when you
35:10 Allow teams and users who’ve gone through, we’ll say, the proper training or the proper channels to add that additional data to a semantic model, and that’s also controlled in something like managed self-service. So that model cannot be published somewhere where it can go to the entire org. It’s already known: this is a composite model. This is where it makes a ton of sense. This is where I’ve actually seen it take off. Because it’s also who owns it, right?
35:43 Because you can’t say, as the developer, well, I built the sales model and now every report that’s composite is wrong. Okay, how do we make sure that blame, in a sense, goes to the right person or the right team? Yeah, responsibility. That’s where I really want to use promoted and certified data sets. They make a lot more sense here, because then you can restrict who’s actually able to certify things. So I think it expands beyond just certified and promoted, though, because I think even in the
36:18 Managed self-service world, you’re going to have a lot of semantic models, and not all of them are going to be certified. Very few of them will be certified, I would say. Exactly, exactly. If you’re giving people models that are certified and saying, build more things on top of these, right, I would assume almost none of the things being built off of those certified items are going to be certified, in the sense that you’re now handing off responsibility to a different team. The central BI team should be the one building those certified items. And then, let’s say, Tommy, you come with this amazing new algorithm
36:50 And this amazing new report. It’s doing really impactful things. If that’s something that’s adding legit value with your composite model, or whatever you built in the composite model, me as central IT would say, let’s go look at the usage of Tommy’s report. Wow, this is performing really well. We’ve got a lot of hits. It seems like it’s gaining some traction. What can we do to incorporate his changes into our main model and deprecate what you built, but make it more efficient and faster in the central space? And then you’re basically killing off the uncertified things and bringing them
37:22 Back into a certified space again. Yeah. And I think the rollout there is so necessary. As much as the initial nerd in me wants all my semantic models to have the ability to be connected to other sources, because that’s what I like to do, I look at something and go, oh, I wish I had this. But the rollout’s so important here for it to work out well, because again, those gotchas. But let me ask you, before I say the same thing over: would you agree or disagree with the statement, then, that the
37:57 Composite model route only makes sense in a managed self-service environment? That’s a great question to think about. I think I’d have to answer it with: are you only playing around in powerbi.com, or do you have access to Fabric? But let me clarify the question. Interesting. Interesting. What do you think? So let’s go with this standard, because I know Fabric is going to change things. Let’s say just PowerBI. Okay. In just PowerBI, I think there is a use case for some lightweight composite models to be used in
38:31 Self-service. I do want to caution, there’s definitely a big warning label that comes with it. Your performance will vary. You may get fast results if your model size is small, like under a gig. Mhm. With composite models you probably won’t even bat an eye, right? If your data set’s small enough, you probably won’t even notice. We find a lot of times, when companies are starting out building PowerBI models, the speed of the Analysis Services engine is so fast that you can get away with a lot of bad designs and it still works just
39:04 Fine. And again, it’s very forgiving in that way, right? Usually I get called when they’re like, “Oh, my model is not fitting on this server,” or “my model is too big,” or this, that, and the other thing. So, how do I trim this down? Oh, okay. Well, you loaded every single column from the SQL database. You’re not actually using every single column in the actual report. You probably need to ditch these other text fields or columns, because you don’t need them and your model’s too large. Or you’ve built this one monolithic model that has all reports, all pages in one big
39:36 Semantic model. What? You’ve now broken the threshold of one gigabyte. You now need to break that thing into domains, like we said earlier, which means building domain-specific models, and then it gives you a lot more flexibility as well. So I will say, in the PowerBI world, I think there is a use case. Again, I would highly recommend you go look up Kevin Arnold’s composite models best practices. He’s got like five use cases where he thinks it makes sense, and I would agree with them. They make a lot of sense to me as well. So that would be one area. When we move over from just pure PowerBI to now
40:10 Microsoft Fabric, I like this new world that we’re living in. And TMDL, I think, is also slightly shifting how I view model development. Okay. Okay. In TMDL you have the ability to see the entire definition of the model as flat files. Also, inside TMDL you can reference other files. So each table becomes a file, and in that table file you can have all the measures of the table. So one of the arguments for using
40:41 Composite models was: I have this table, these are the factual measures that live in this fact table, I need to lift that somewhere else, and I had to rebuild those measures again, right? I want to reuse the business logic. So now there’s really this concept that the golden data set doesn’t actually have to be a deployed model. The golden data set could be a semantic model that has all the definitions, and you save it in the PBIP format, and you take the TMDL file for fact sales and
41:15 You can then lift that into other models. Or, in other words, Tommy, you can take an existing large semantic model, which is the semantic definition of the entire company, and then you can just delete tables and get a smaller domain model. So now you can build automation around this. Now you can build automation that deletes files and makes things smaller. Basically, think about the semantic model. How great would it be to take this one golden definition of a semantic model, the schemas, all the relationships, all the definitions, the tables, and just delete
41:48 Off what you didn’t need, and turn one main model into four or five smaller subdomain models? Yeah, that’s possible now, and I think that wasn’t really possible before. So the other thing I’ll point out here, when we go into the world of Fabric, is we have this idea of islands. Remember we were talking about that earlier? Oh yeah. When you go between lakehouses. So let’s say I have a lakehouse for sales and I have a different lakehouse for operations. When I pull data from the lake, it’s the same island. We have no problem now. So this is another thing that I think
42:21 Changes the game for me in Fabric: data sets now have fewer islands, because I can define multiple tables from multiple different lakehouses, but everything still acts as if it is the same island inside the PowerBI semantic model. And now we’re getting all the efficiency improvements of data sets, but we can still talk to those tables collectively. So my opinion here is, if I had to choose, Michael, you need to use composite models, or you can use
42:52 Lakehouse and tables, or warehouse and tables. Whichever; lakehouse and warehouse are very synonymous depending on what your skill set is. I’m going to choose lakehouse or warehouse with tables, because I get the advantage that everything’s on the same island. I get the advantage that everything can be loaded directly into the semantic model. I can now build relationships, tables, and measures and reuse them using TMDL, which helps me build models faster. So, that’s where I’m thinking,
43:23 Tommy. And I think this is where everything changes for me, too: the day that Microsoft announced that you can build a Direct Lake semantic model with tables from multiple lakehouses. That’s when everything changed. Game changer. No, I think that absolutely goes on the t-shirt as a game changer, honestly, because getting that implemented would, to me, in an ideal world, be the first approach, because that completely changes the last 10
43:56 Years of how we thought about developing models. Yes. And the reason I caveat that with the ideal world is because, here’s the thing, I’ll just throw it out there too. Mhm. The beautiful thing about a PowerBI semantic model is I can have it on desktop, and it’s, so to speak, easy for someone to recreate or just use that PowerBI file. Once we start dealing with tables from different lakehouses, and then trying to distribute that.
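The trimming workflow Mike described a few minutes back, deriving smaller domain models from one golden PBIP definition by deleting table files, is essentially just file manipulation. Here is a minimal sketch of that automation, assuming a simplified PBIP-style layout where each table lives in its own `.tmdl` file under a `definition/tables/` folder; the folder paths and function name are hypothetical, and real projects also carry relationship and expression files that would need the same pruning:

```python
import shutil
from pathlib import Path

def derive_domain_model(golden_dir: str, output_dir: str, keep_tables: set[str]) -> list[str]:
    """Copy a golden PBIP semantic model folder, then delete every table
    .tmdl file that is not in keep_tables. Returns the kept table names."""
    src, dst = Path(golden_dir), Path(output_dir)
    shutil.copytree(src, dst)  # start from a full copy of the golden model
    tables_dir = dst / "definition" / "tables"
    kept = []
    for tmdl in tables_dir.glob("*.tmdl"):
        if tmdl.stem in keep_tables:
            kept.append(tmdl.stem)
        else:
            tmdl.unlink()  # drop tables the domain model does not need
    return sorted(kept)
```

In practice you would also prune relationships that reference the deleted tables before the model validates, but the core idea, one golden definition fanning out into several subdomain models, is scriptable exactly like this.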
44:29 That’s when things, I think, get a little more complicated. However, if I’m a central BI team, oh my, I’m never doing Power Query again. Almost. Gosh, I hate saying that. But the thing is, I used dataflows, and 95% of the time it was for master tables, things that we’re going to use and re-reference and re-reference in the models; that other 5%, though, was awesome for fact tables.
45:03 Yep. Now, the only limitation with that was the dataflow had to refresh, and then I had to refresh my model to get anything updated. So dimension tables made sense. But with this, all my fact tables can live in the location that I want, and we can have those single lakehouses, maybe in a managed self-service approach, but regardless, we have our single set of master tables for dates and customers, etc. And the beautiful thing
45:38 About that, too, is any expansions or extensions of those tables are easy to do. So users are almost getting the buffet version of picking a model. Obviously, organizing the choices becomes important here. Yes. But this, to me, once it really starts getting rolled out, is, not to understate it, the most consequential change to how we develop PowerBI models and PowerBI
46:11 Reports that has happened since PowerBI came out in 2015. Interesting. Tommy, I’m... You’re pondering. I’m pondering right now. Your comment there made me think. So while you were describing things, I’m going to just go on an idea. Imagine a world, right? Okay, so imagine a tool that would go through all of your semantic models and find all of your lakehouse tables
46:45 And then build relationships between them: okay, we know that these tables are referenced in these different Power Query experiences, right, these different semantic models downstream, by name, right? So imagine first we start at the lakehouse level. We now have all the lineage of the lakehouse tables. Then imagine a tool that would go through all of your semantic models and find every single table in every single model, and it would show you the lineage from the lakehouse: this is the name of the table, that table is then referenced in this Power Query here,
47:17 These are the steps. This table was transformed with these three transforms, and then these are the measures that attach to it. There’s a view of this. So, like I was talking about earlier, the golden semantic model for your whole organization: you could build that. That’s something that’s buildable today. And you don’t have to start with the golden semantic model. You could start with your lakehouse tables and all the currently deployed semantic models and say, where does all the data in my organization come
47:48 From? What server does it come from? What’s the path of it? To me, this is like a graph database. This is a bunch of relationships that we’re able to piece together. So imagine, say, Tommy, you’ve got 30 or 40 semantic models across your organization that you’re looking at, right? Every single one of them has references back to some data source. So, companywide, you should be able to say, “Here’s all my data sources across my entire organization. Here’s all the tables that we’re producing from those.
48:20 Here are those tables, and here are all the semantic models they live in. And here’s which semantic models did transforms on top of those tables. Oh, by the way, here’s the complete list of all your measures across all of those tables. And here are the relationships between all those tables.” If you just took that information by itself, all of it links back together. All of it’s related, and you could have a centralized graph database: scan my tenant, see all the items we have. And then you could say, look, here’s our production SQL server. You could
48:52 Literally click on that one and see, all the way downstream, here’s all the tables, here’s where things are being transformed. Because I went through a project recently where we did this. We had, let’s call it 15 or so reports. Every report was being designed differently. Every report had its own semantic model, and they were writing custom SQL for every single report to get the results out. Shockingly, nothing tied out together at the end. Nothing matched at the end of the day. And since nothing was matching at the end of
49:26 The day, that’s because every single semantic model had different business logic. How cool would it be, as an admin, to say: where is business logic being applied to this source table, and what changes are being made in other places? It would be incredible. The part of this that’s blowing my mind right now is that it exists. It’s in there. It’s in the model. It’s in the computer. It’s there. All we have to do is reach it, man. It’s in the computer. So, it’s just asking for us to get it, man.
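The lineage idea the guys are describing, every model’s tables traced back to their sources, is at its core a small graph problem. Here is a minimal sketch under that framing, using simplified, hand-made scan output rather than the real Power BI admin scanner API payload, whose shape is much richer; the function name and source URIs are hypothetical:

```python
from collections import defaultdict

def build_lineage(models: list[dict]) -> dict[str, set[str]]:
    """Map each upstream source to the semantic models that depend on it.
    `models` is a simplified scan result: one dict per semantic model,
    each table carrying the source it was loaded from."""
    downstream = defaultdict(set)
    for model in models:
        for table in model["tables"]:
            downstream[table["source"]].add(model["name"])
    return dict(downstream)

# Hypothetical scan of two deployed models
scan = [
    {"name": "Sales Model", "tables": [
        {"name": "FactSales", "source": "sql://prod/sales"},
        {"name": "DimDate", "source": "lakehouse://shared/DimDate"}]},
    {"name": "Ops Model", "tables": [
        {"name": "FactShipments", "source": "sql://prod/ops"},
        {"name": "DimDate", "source": "lakehouse://shared/DimDate"}]},
]
lineage = build_lineage(scan)
# Clicking "lakehouse://shared/DimDate" would surface both downstream models
```

With edges like these, the “where is business logic applied to this source table” question becomes a downstream traversal; a graph database is one way to store it, but even a dictionary of sets gets you surprisingly far.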
49:57 Yeah. I just got to write some poetry to get it out. So, anyways, your comment there, Tommy. I think in general here, right, what we’re talking about is that the world has opened up to us. Composite models, I think, were maybe pass number one of this. The lakehouse, Fabric, the extensibility of the PBIP format, that’s really opened a new door for us. And that’s why I think Tommy and I would recommend, if we had to pick one or the other, we would choose data sets and data modeling over composite models. And it’s more
50:29 Around the idea that the world of Fabric has opened up the capabilities for us to do more, discover more, and see more within that multiple-data-model arena. And maybe that’s a better way of saying this one. Yeah, you can use composite models. Yes, it works in a pure PowerBI world. But if you’re even thinking about or starting to use Fabric in any form, you really should be considering redesigning your models, or looking at what your models are doing and saying, “Okay, let’s look at the domains of all the models and find out the
51:01 Full lineage of everything.” There’s your million-dollar idea there, Tommy. You just got to start working on it this weekend. How about code it? Vibe code it. There’s a few agents I’m going to have to write. No, but I think there are going to be a lot more episodes on this too, because it’s interesting. We’re talking about this. We love this. Mhm. And maybe people just don’t blog about their internal environments anymore. I don’t know. But I have not seen a lot of buzz about this approach, which
51:36 Is interesting. Right. So the blog article came out, and we’re talking about what is possible here, being open about all the doors that have opened up, like you just talked about. I’ve had no follow-up from Microsoft. People I talk to don’t see it. To me, this is one of those: hey, everyone’s been riding horses; by the way, there’s this really cool thing, it’s called a car. And we go, wow, that’s really awesome, giddy up, and they get back on their horse. There’s almost a little of that
52:07 To me. I don’t know if that makes a lot of sense, but if this does what it says it can do, then to me this would be where the focus goes, from Microsoft’s point of view, to other MVPs, to organizations, saying, no, this is the direction we’re moving. I think what I feel is the main hesitation right now is really around what Microsoft is providing for licensing at this point. I think the
52:40 Main hindrance to this being the new standard is that all those pro users don’t get access to Fabric. And I would argue, look at your organization: who has access to Fabric? It’s probably a much smaller surface area of people than what we have just using PowerBI. You look at the PowerBI ecosystem, and again, these are numbers as of March of this year, I think Microsoft quoted something like 30 million monthly active users in PowerBI. So that’s a huge number. Now, Microsoft has also touted that the
53:13 Growth of companies using Microsoft Fabric has been increasing. I don’t know the actual number on that one. I don’t remember any quotes from Microsoft around how many organizations are now using Fabric, but it’s growing and growing. It’s the fastest growing data platform they’ve ever had, which makes sense, because it’s easy to use. It’s right next to everything else. They’re constantly prompting you to get a trial, start it, get going, right? So a lot of this stuff just makes sense. And I think, to your point, Tommy, there’s a hesitation. I think people are a little bit leery: if I go into Fabric, there are
53:46 So many options. I think people are just nervous about getting the decision wrong and having to rebuild a bunch of things. Yeah. And so organizations are like, we think Fabric might be the right solution, but we’re hesitating right now. So I think there’s just a hesitation from organizations to step into yet another brand-new tool to see if it’s actually going to add value. So what are we, three years out on Fabric, Tommy? Two years out. GA, I think, was just a year and a half ago. So we’re only two, let’s call it two
54:18 Years in. We’re only roughly two years in on Fabric, and we’re just at the beginning of the adoption curve of this thing. So if it keeps growing, I think in year three, four, five, you’re going to start seeing a lot more organizations comfortable saying, this is solving problems, we’re finding real value from Fabric, and we’re going to start adopting it faster. I think there’s this idea of, I know what the cost is. I have pro users. I have licenses. It works. The fact that I have to pay under a different licensing pattern to get into the Fabric space, I think, has some organizations hesitating. The bigger ones, the ones that already have
54:51 P SKUs, the big organizations, are already in. They’ve already bought it, because by default your P SKUs are already Fabric SKUs anyway. So I already see that being a no-brainer for them. If you’re large and you have a P SKU, you’ve already made the commitment: yeah, PowerBI is going to be worth it for us, and now you can add all the Fabric stuff as well. Interesting. Yeah. Well, and I think a lot of things right now are on hold. Imagine this happened to us with PowerBI: the only thing that got released in the last three years was having your tables easily connected to whatever.
55:27 This is the one feature. I imagine this would be the talk every week, every month, in terms of what people are doing, if we had nothing else. I know that’s odd to say, but with no notebooks and nothing else, this to me seems like the biggest game changer. But I would agree with you, Mike: yes, absolutely, this is a huge deal, but how many organizations have actually completely moved over to it right now? I don’t think it’s a complete move at all, by any means. I think there’s also a little bit of politics going on here. Right. There are some political challenges here.
56:00 Some organizations have a dedicated data engineering team. They’re maintaining warehouses and databases and things like that, right? So you would need to transition that team. Are you going to stay in SQL Server on-prem? Are you going to stay in some other Azure-related service? A lot of companies have already put together a lot of technical debt around making sure these other data products they’ve built work. To throw that away and start greenfield again with Fabric, I think, is honestly a mistake. So it needs to happen at those decision points where
56:31 Organizations are saying, I need to move away from on-prem, I need to move to the cloud, what is our next option? And I think those are the companies that are like, okay, let’s adopt Fabric, because I don’t have to stand stuff up. I get one purchasing place, a capacity, and from there I can build anything I want. I think also, Tommy, organizations that are currently on legacy products, maybe already in Azure, are trying to modernize themselves. Hey, we don’t want to be on SQL servers anymore. We want a different way of doing data. At the end of the day, the reports are always going to be in PowerBI. It feels to me like that’s
57:03 Going to be a staple, right? That’s going to be the one consistent thing. And what I’m finding now is, because PowerBI is the anchor to a lot of this story, organizations are exploring. Okay, well, what could we do to make it easier? Oh, interesting. Direct Lake means I don’t have to import data anymore. Well, that would save me an hour every day on every single semantic model. That’s nice. I’d rather not have to import everything all the time. Oh, I need more real-time data. What does Fabric offer? And I’m saying real time in the sense
57:35 That it’s not a stream of high-velocity data. I have a slow stream. One file appears in my lakehouse every day, and I want it to automatically load when the file arrives. Right? It’s that stuff. I want event-driven data loading pipelines. That’s a legitimate need. And that is real-time data, it’s just doing the load when something shows up. It’s a real-time data load pipeline. So, it’s that stuff that I think organizations want to start using, and now Fabric is making
58:08 That easier. You could have done all this with other tools. You could use Databricks and Snowflake. And I think Microsoft’s really coming in strong now. It’s getting its feet under it, it’s making a lot of improvements, and they’re being really competitive against Databricks and Snowflake. And you’re seeing Dataflix, that’s Snowflake and Databricks all mixed into one. Databricks and Snowflake are now trying to build semantic models. They’re building in-memory tabular models so they can cache the data and compete with PowerBI
58:43 At the report layer. It’s coming. The fact that they’re building what Microsoft has already had for 10 years, 30 years, whatever it is, in Excel, just tells me that Microsoft’s on the right pace here. They had the right solution. Dude, I cannot wait to see it. Here’s the nice thing, Mike. Regardless, it feels like we’re going to have a job forever in this space. I’ve said this before with you in the past: data is not going away. They’re not throwing away data. Where in the world is less data being created? If anything,
59:15 All the AI and automation is just creating more data. We’re exponentially growing the amount of data, Tommy. I think as long as we can stay in tune with it and work with it and adapt, right? You’re right. I think we’re in a place where there will be many, many years of data engineering, shaping, developing, and building, as long as we stay at the forefront of what Microsoft and the big companies are doing to help organizations manipulate and shape and store data. Dude, I love it. So hopefully we
59:47 Answered the question, too. But Mike, it’s funny, because obviously for you and I, we’re already on the Fabric bandwagon. And I think the person asking the question here is probably like, okay, okay. But here’s the thing: for a lot of people who are listening, it’s going to happen for you whether you want it to or not, this move to Fabric. So I cannot wait to see what the future holds. Every single week, Microsoft is building new features inside Fabric that are making the
60:20 Data story more compelling. It’s getting to the point where the value proposition, or the opportunity for value, inside Fabric is becoming so great. Why wouldn’t you just try it? See if it can make you go a little bit quicker. Can you take a project that used to take a month down to a couple of weeks, or a week? That’s amazing. And I’m also excited to hear the announcements coming out of FabCon Vienna. I believe there are some big announcements, and maybe that will also help push things forward. So next week, when we’re actually back on
60:52 Live air again, Tommy, we should definitely do a recap of the Fabric conference and pick out the key announcements and things that we think are really exciting and new. So that being said, I think we’re fully at time here. I’ve had a great time talking about this one. Tommy, this is a great topic. Nice thoughts. Great question. Our community is amazing. And thank you, mailbag, for sending us a great question. We really want to encourage that from the community. It helps us direct our conversation to things that you care about, and we love that. So make sure you engage with us and ask questions in our mailbag. We have a form on the website. You can submit your questions there. Tip or
61:26 Trick. If you tell us you really like the podcast and how much you’re learning from us, we’ll probably pick your question more over than others. Not saying that’s going to actually happen, but , Tommy picks the questions from the from the the mailbag. So maybe you maybe you sent him some pasta. Tommy will take your question. Take your question. It’s got to be fresh. It’s got to be fresh. That being said, thank you so much. we’ve introduced something new on the podcast episodes. If you want to watch this podcast episode on YouTube with no ads, you’re more than welcome to. We have a memberships area. Visualize this
61:58 As one of our members. We’d love for you to be part of our community and chat with us and connect with us on that community as well. You can get this whole episode free of ads and early. We will post the episode as soon as it launches on the visualize this space on our YouTube membership place. So feel free, please join us as a member and support the channel. And if you’ve learned something from this, please become a member. We’re going to keep doing this. It helps us keep going and fund the the expenses that we have that do video and all this other stuff. So we really appreciate it.
62:30 That being said, Tommy, where else can you find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. If you have a question, idea, or topic that you want us to talk about in a future episode, head over to powerbi.tips/empodcast. Leave your name and a great question. And finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, on all PowerBI.tips social media channels. Let's make a quick correction there, Tommy: it's powerbi.tips/empodcast, not powerbi.tips.com.
63:04 You can't type that; it doesn't work. It's only powerbi.tips/empodcast. We'll be there. The questions are there, so feel free to submit your questions in there as well. We really appreciate the community. Thank you so much for all the things you've been doing. It's super fun to be around this community. Have a great week, and we'll see you next time.
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
