PowerBI.tips

What Is Real-Time Intelligence? – Ep. 467

October 15, 2025 By Mike Carlo, Tommy Puglia

Real-Time Intelligence is more than just fast dashboards—it’s a fundamentally different way of thinking about data. Mike and Tommy break down the scope of Fabric’s RTI workload, how Eventhouses fit in, and what makes event-driven analytics different from traditional batch processing.

News & Announcements

Main Discussion: The Scope of Real-Time Intelligence

What RTI Actually Is

Real-Time Intelligence in Fabric is a complete workload built on the Kusto engine:

  • Eventhouses — The storage and compute layer for streaming data (successor to Azure Data Explorer clusters)
  • KQL databases — Where event data lands and gets queried
  • Real-time dashboards — Purpose-built for low-latency visualization
  • Data Activator — Event-driven triggers and automated responses

Batch vs. Event-Driven

The fundamental paradigm shift:

  • Batch analytics — Collect data → process on schedule → query the result → make decisions hours/days later
  • Event-driven — Data arrives → processed immediately → insight available in seconds → action triggered automatically
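To make the event-driven column concrete, here is a minimal sketch of the "insight available in seconds" step as a KQL query over a hypothetical `DeviceEvents` table in a KQL database (the table and column names are illustrative, not from the episode):

```kusto
// Summarize events that landed in the last five minutes, per device.
// DeviceEvents, DeviceId, and Temperature are hypothetical names.
DeviceEvents
| where Timestamp > ago(5m)                  // only the freshest events
| summarize Events = count(), AvgTemp = avg(Temperature)
            by DeviceId, bin(Timestamp, 1m)  // one-minute buckets
| order by Timestamp desc
```

In a batch pipeline the same question would wait for the next scheduled refresh; here the query runs directly over events ingested moments ago, and a Data Activator rule could fire on the result with no report in between.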

When to Use RTI

Not everything needs real-time:

  • Use RTI for IoT telemetry, application logs, security events, live monitoring, fraud detection
  • Use batch for historical reporting, financial close, planned analysis
  • Hybrid — Many scenarios need both: real-time monitoring with historical context

Eventhouse Architecture

Mike and Tommy explain how Eventhouses work:

  • Optimized for append-heavy, time-series workloads
  • Data is automatically indexed by time
  • KQL queries are pipeline-based and optimized for large-scale filtering
  • Integrates with OneLake for cross-workload data access
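The "pipeline-based" bullet can be made concrete. A KQL query reads top to bottom, with each `|` stage narrowing the data before the next stage runs, which is why an early time filter is cheap against time-indexed storage. The table and column names below are assumed for illustration:

```kusto
// Hypothetical application-log table in an Eventhouse KQL database.
AppLogs
| where Timestamp > ago(1h)        // time filter first: leans on the time index
| where Level == "Error"           // then filter on other columns
| summarize ErrorCount = count() by bin(Timestamp, 5m)
| render timechart                 // visualization hint for dashboards
```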

The BI Developer’s On-Ramp

For Power BI pros looking to add RTI to their toolkit:

  • Start with the RTI overview
  • Understand Eventhouse concepts
  • Learn basic KQL (it’s approachable if you know SQL)
  • Connect RTI data to Power BI reports for hybrid scenarios
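On the "approachable if you know SQL" point, a side-by-side sketch may help. The two queries below ask the same question of an assumed `Sales` table (names are illustrative):

```kusto
// SQL version, for comparison:
//   SELECT Region, SUM(Amount) AS Total
//   FROM Sales
//   WHERE OrderDate >= DATEADD(day, -7, GETDATE())
//   GROUP BY Region
//   ORDER BY Total DESC;
//
// The equivalent KQL reads as a top-to-bottom pipeline:
Sales
| where OrderDate >= ago(7d)
| summarize Total = sum(Amount) by Region
| order by Total desc
```

Microsoft's documentation includes a SQL-to-Kusto cheat sheet, and the Kusto engine can also run a subset of T-SQL directly, which can make the on-ramp gentler for BI developers.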

Looking Forward

Real-Time Intelligence is becoming a core pillar of the Fabric platform alongside data engineering and BI. As more organizations generate event-driven data (IoT, SaaS telemetry, operational systems), RTI skills become essential for the modern data professional.

Episode Transcript

Full verbatim transcript:

0:00 Good morning and welcome back everyone to the explicit measures podcast. We’re glad you’re all here and hello. so a

0:34 Quick note here real quick. so we were starting a little bit late. We had some technical difficulties getting started right at 7:30 this morning. So just FYI, sorry about being a little bit late. We were having some technical challenges on our side. Computers don’t always like you all the time apparently. So that’s that’s a thing. Not always. Not always. Not always. All right. Today our main topic will be talking about real time intelligence. So that’ll be our main topic for today. going deeper as what does this mean? How do we unpack this? I think there’s a little bit of a misunderstanding of what this potentially could mean and how it

1:06 Could be useful in your business. So, let’s we’ll unpack that here in a little bit, but before we get there, Tommy, you have some news for us and and I believe Tommy has lost a bet. I This is This is good. I think this is good. We’ll see. There’s a lot of things that happened. So last week when we were on vacation doing some recordings, there’s a nice little blog article on the PowerBI update. Sounds great. And it starts with deprecation. No worries. Usually any feature or blog that starts with

1:38 Deprecation usually mean a feature is going to go away in about 6 to 8 months. That’s fine and gives us enough time. Well, lo and behold, the my eyes widened as it was deprecation of metric sets which have been around for a year. Never got out of GA. Yep. And this is insane. Not only that, Mike, this I’ve never seen any announcement or deprecation sets or

2:11 Deprecation notice with this timeline. Yeah. What’s changing? This is October 6th this month. Yep. October 25th, you can’t create metric sets. November 15th, they’re going to be gone from the full retirement. Gone. Full retirement. And Mike, to put just to put the layers on on October 20th this month, I was giving a session at Community Southern Orlando that’s now updated on metric sets with a full demo. Clearly, they don’t know what

2:42 Sessions they’re picking. , they obviously aren’t into This is the one who suggested it. All right. Well, so so this two two things come come to mind when I see this something like this happening. The first thing that comes to mind is I realize that Microsoft can’t support every single feature. And the speed in which things are being deprecated here is probably indicative of how many people are actually using the feature, right? So if there’s not a lot of people using the feature, it probably means you’re going to be able to deprecate it very quickly with minimal impact to the rest of the organization or how people are

3:15 Using that. One thing I will just note here, I’m saddened by this a little bit. I liked the idea of metric sets. It was focusing more on the measure and then all the dimensions that apply to said measure. I think potentially the roll out of this metric set thing wasn’t very ideal. maybe getting how do you how do you go from a metric set to like building multiple visuals? How do you go from a metric set to getting it into a report or how do we build from this a little bit easier? So I I think the idea or the concept was nice. It’s just

3:48 Disappointing to see it deprecated so quickly without any additional feature pieces on top of it. So I don’t I don’t really know why it didn’t get the usage or the love that it needed. I didn’t really hear no one was talking about it. It was exciting when it came out. , it seems to fit. I I like the idea of it. Here’s the KPI. Here’s the measure. And here’s the different dimensions that should be used or applied to that measure. That made sense to me. I think that makes you could reuse that across your organization. The story was nice. I liked it. This is crazy, Mike. And not just

4:21 Because I had a session that I was preparing for on this. It’s surprising to me especially because of the timeline the the quick turnaround that they’re gonna it never seen before and they’ve done that with a bunch of features but this is surprising because I did make a bet with I believe Kie Carly Carly at the Microsoft product ma product manager team and I made a bet that before GA two years after metric

4:56 Sets went GA, it would take over more views than reports. So, well, technically, did I lose? Did I lose because it never went GA? You like super, in my opinion, you like super lost because not only did the feature not even make it past preview, it never made it to GA. It never even got it so bad it wouldn’t even get to G. I never said it was going to get to G. That’s like me putting a bet on the Packers Bears game and lightning hits and there’s no game. Who know who knows? So anyways, on one hand, so on on on one side, this is

5:28 Sad that the feature is getting deprecated. I think the speed of which it is deprecating is because purely because there’s not a lot of people using it. Okay, fine, move on. On the flip side of this though, this does for me give me some hope because if you were going to maintain this feature or maintain this item, this just means resources can be reallocated to other things inside the PowerBI platform, which I hope they’re putting more people on the visualization side of things, which needs a lot more love and needs a lot of improvements and other other features as well there additionally. Anyways, so really cool. that’s a that’s another news item. I want to

6:02 Throw out another one here that I'm excited about. Now, this is not a deprecation of something. This is actually an enhancement on OneLake. I like this whole concept and the data that's coming out of the OneLake experience. I'm also building a lot of products around data lake external sharing, so getting from one Fabric environment to another. Stay tuned. You're going to hear about a new product that we're coming out with called Data Marketplace in the next couple of months here. We're going to start doing some announcements and some YouTube videos around what a data

6:34 Marketplace is and how can you go get it as a workload. So if you want more information, stay tuned. Make sure you subscribe to the channel. There will be more information coming on that shortly. The OneLake APIs were announced at the Fabric conference in Vienna. Okay, huge announcement around Snowflake and Power BI: Fabric OneLake integrations are now becoming more seamless, much easier to work with. So it says here, starting with the Apache Iceberg REST catalog, you can now use the OneLake

7:07 APIs for Iceberg tables. So Iceberg is another columnar storage table format. It's designed slightly differently than a Delta table. I think you basically have three open table formats: you have Delta tables, you have Apache Iceberg tables, and then there's another one, Apache Hudi I think, that's not quite as popular, but the two main ones are Iceberg and Delta. Databricks adopted the Delta format and then Snowflake adopted the

7:40 Iceberg open format. They both have their advantages. They're both super fast. And now you can start using Iceberg inside of OneLake, which I'm excited about and I'm looking to build around that. So if you use that, if you're interested in something like that or you want to start sharing data using Iceberg tables or a product that uses Iceberg sharing, let me know. Reach out. And dude, anytime, give me more APIs. That's all I know. I love me some APIs anywhere they are. And especially with workloads, they're essential, right?

8:13 It opens up a lot more doors for you when you’re building workloads. I’ve been building workloads since they came out. We’ve got one out now. We have another one in private preview called data marketplace. We have two more planned on the on the road map that we’re going to be building more workloads for fabric. I think it’s a great opportunity for companies to really leverage their technology and do a good job there. Anyways, enough said there around the two news items. Any other news items, Tommy, you want to discuss? , I think that’s everything, man. I is that all the news? There’s more stuff on the blog, but I

8:45 Think we’re just going to hang on these two items only. All right, with that being said, let’s jump into our main topic today. So, today I’d like to introduce a PM from the Microsoft product team. Chris or Christopher, welcome to the welcome to the podcast. We appreciate you joining us today. So, today’s topic is going to be all about real time intelligence. What is it? How do we hear this word there’s an icon in the portal if you’re a traditional BI developer on fabric or traditional BI developer in PowerBI you may not be very

9:18 Familiar with real-time intelligence. So Christopher, welcome to the podcast. We're happy to have you unpack this idea or this concept of what real-time intelligence is. What part of the product team do you work on? Yeah, hi Mike, hi Tommy, thanks for having me, looking forward to the conversation today. So my name is Christopher Schmidt. I'm a principal program manager on the Microsoft Fabric Real-Time Intelligence Customer Advisory Team, which I've checked is not the longest team name inside of Microsoft. I have lost there. There are longer. I keep looking for more words to add

9:52 To our team name because we're going for the title. Please add the word agentic or AI, like the Copilot version of real-time intelligence including agentic applications. Yes. Yes. Exactly. Yep. Amazing. Awesome. So, you're a PM on the team. So, I think there's a little bit of a, not a misnomer, but I think when people initially think of, or when I talk to clients around, real-time intelligence, I think there's this initial thought or feeling that I have to have IoT devices and I have to be

10:26 Sending real-time analytics from like a website. It has to be this always this high volume of information landing somewhere into the real-time intelligence platform. While this is part of that story, right, you totally can hook up and wire those things up. but I think there’s also another portion of the story and again I’d love your your your input on this one too as well, Christopher. There’s another story here where it’s more like event driven things. So, and and I’m I’m using event driven because a lot of times I’m working with like a backend system

10:58 That's completing a process and I need to have something load immediately when it's done, like a real-time load of when the data processing is complete. Is that in scope for you, or are you focusing only on high-volume, all-real-time intelligence data? You totally stole my thunder, like everything I was going to say. Yeah, so when we think of real-time intelligence we commonly think of streaming data and high-velocity streams and where it is that I'm coming from, and while, like you said, that's important

11:32 And that’s a huge part of some of the challenges that we deal with it’s not the whole story right I always joke that there’s a reason I’m not in marketing but if I was in marketing I would call it the fabric event driven platform not fabric real-time intelligence and everyone always tells me like Chris that’s a terrible name like no one no marketing person’s ever going to call it that. But well, what I I say marketing has done some pretty interesting things naming some stuff like transitical task flows. I’m not a fan of that name either. So let’s just be clear Microsoft’s marketing team isn’t like hitting you

12:05 Know bombers every single time they’re making they’re making a name up or something. There are a couple swings and misses. Yeah. Yeah. Absolutely. But I think it it really starts to tell the story of what real time intelligence really is at its core. and if if you remember when real time intelligence first came out it was really called real time analytics right? So like the very first iteration was real time analytics we changed that name and we moved away from that because it doesn’t really represent everything that the product can go do right real time intelligence is like when you really get down to it and you really understand the underlying reason why it’s called that it really

12:37 Starts to make a lot more sense. But it’s this idea that no matter how quickly I’m getting my data in and whether I’m getting that data in at a second level because I’m pulling data from a bunch of sensors and machines and I want to be pulling all that data in real time and ingesting it or whether I’m only updating that data maybe, , once an hour or once every 15 minutes or I only get that that operational process or that thing every so often. I still want to act on those things as quickly as I can. I want to move to the ability to be able to take proactive actions inside of my environment beyond

13:13 Just the traditional reporting style. Right? So I like that when we think of what we’ve seen from analytics over the last 15 20 years really as long as I’ve been in the business which it feels like a long time. , but the primary mode of delivery has always been a report, right? It’s like I’m going to give you a report and then you’re going to look at it and you’re going to analyze it and then you’re going to go make some decision on what it is that you want to go do. Real time intelligence is about this idea of I don’t necessarily need a report. I can

13:45 Have a report if I choose to, but really I want to unlock any of a variety of different action systems that I can use to go consume and act on that data. So it may be a report, it may be a real-time dashboard, it may be something like an activator alert where I want to go trigger something like an if this then that rule, right? It may be agentic AI application or it may be a conversational AI application. It could be all these different things that I’m pulling. Then I want to go see in real time and I want to go take some action, right? I was listening to your episode last week around the the email subscription, right? And I was I think I

14:19 Was telling you in the in the pre-show a little bit around like, , I I listen to you guys really regularly. I’ve listened to you for a really long time. So like when we talk about getting an email subscription, right, or we talk about getting a report, an email subscription to me is a symptom, right? If I’m emailing something, someone needs to go do something with that, right? And you could really say the same thing about any report. If I build a report, I’m doing it because somebody’s supposed to go do something, right? Real-time intelligence is the ability to say what’s the thing that you’re supposed to go do and is there a way that we can integrate it into our system and just

14:52 Say systemically go do X, right? So I don't have to go through all the middle process of, say, let's create a report, and then let's make it available to you, and then you have to disrupt your flow and leave whatever it is that you're working on and go to www.powerbi.com, look at your report, and make a conclusion and a decision. It's the ability to just say, I know what those rules are. I know what needs to happen in my business from a business process perspective, and I want that to just go automatically happen. I like the point you made here. You hit a couple

15:24 Items that I want to unpack here. You said the real-time intelligence or real-time information is locking a variety of action systems. I like that term. So Tommy and I’ I’ve aligned on this idea of like the semantic model core and key to a lot of things. Real time intelligence doesn’t necessar always have to run through the semantic model to get out to the other side. Tommy, I think you’re nodding your head there. Do you want to say something? Yeah. And I think the biggest thing is we’ve always been stuck in this assumption that real time is for

15:58 Hyper situations or niche industries, right? And I think that’s a lot what Mike and I have been really finding out especially with the event hub. we’re we’re coming from the Azure internet of things and real event stream where was like okay unless you’re secure internet cyber security supply chain like real time retail and inventory then if real time or doesn’t make a lot of sense from a cost point of view a lot of

16:33 People want real time but there was always that okay is it worth the cost. But I think what we’re finding with fabric and tell me I think if this has been also your team’s push as well as well is okay we’re going to do a lot more than just real time so to speak in all the outside or external data sources. What I’m noticing a lot with the event stream is really that ability for things within fabric to see what’s actually going on. Mhm. Yeah. It there’s a a lot of use cases

17:07 That are applicable to things like observability like absolutely right because it’s it really comes from a like that to really understand RTI. If you go back, you can look at some of the existing tools that we used to leverage in Azure, right? Azure event hubs, stream analytics, a tool called Azure data explorer we used to use, right? And so with real-time intelligence, we brought a lot of those tools together and we’ve really tried to simplify that journey to make it a a no code, low code experience. When you look at what it would take to implement a streaming platform, even as recently as about three years ago, it was a really

17:40 Expensive project to go do. It wasn't something easy, right? To your comment you just made a minute ago, back when I was a consultant, and it hasn't really changed, I don't think: if someone would come to you in a business and say, hey, we could really use this data in real time, almost the very first question out of every data team's mouth is why. Why do you want that data in real time? What is it that you're going to go do? What are all those different things? And the reason why we asked that question is because we knew that that's a fundamentally different architecture that we needed to go down

18:13 We needed to go use a whole bunch of different tools we needed to go leverage something that we didn’t really have access to. We didn’t have expertise. We had to go use something like Kafka or we needed to go use something like if we’re using Confluent or we’re going to go create a roll your own process, but it was a whole separate project that really happened outside of the traditional data team process, right? It wasn’t something that we typically manage. And so like it was looking back on it now retroactively I think we can say it was really just us pushing back because we knew what it would take to go implement technically.

18:45 But what real-time intelligence allows you to do is say, “Okay, we’ve simplified this. We’ve brought this into fabric. Now, instead of needing to go through this whole separate project and this whole separate process where I need to go hire a consultant and wait six to nine months, now I can spin this up in a no code, low code platform in really like a matter of days, right? We have customers go from I just heard about RTI to I’m in production four weeks later.” Wow. And so like the the barrier to go from I didn’t have the ability to get my data in real time to now I can really

19:20 Easily build a solution has really closed quite tremendously and when whenever we can get to a no code low code type platform like that it’s also really easy to maintain right I know if I build something in an event stream and it’s all no code or I’m using SQL to go do a transformation and then I’m loading that data into an event house where then I want to go take these actions systems on top of it. I can build it, but anybody can support it, right? It’s not this niche thing that I need to go have a whole bunch of specially trained people to go do. I can

19:52 Just open that up and democratize it and say, “Hey, Tommy, I built this. Like, it’s all yours now, right? Like, let’s support it together as a team.” And so, that that barrier that previously used to exist is really gone. Like, it doesn’t really exist anymore. And so, it makes it much easier to to really cross this this gap. A couple a couple act a couple thoughts came up while you were you were talking there Christopher. One was I liked your question as a consultant because I do the same thing as a consultant. I think Tommy you do the same thing as well. Someone says I need this in real time and I want to ask another question. The

20:24 Follow-up question is, okay, from the moment you see the data till the moment you need to make a decision, what's the time? Like, what's the timing of that portion? And I've also used this: it's like a gas pedal on a car, right? If you want to go faster, you're going to inherently use a bit more CU. Now, I would argue the CU ramp-up of cost before Fabric and now after Fabric is different. I think that the ramp-up is still different, but if I'm doing things where I'm regularly trying to real-time things,

20:58 That’s more expensive than if I’m just doing like this other side of the world, which is real time, which is still like eventdriven architectures. That also seems to be another really useful pattern. , triggers that are running a pipeline of when a file appears or when something happens, I immediately do a loading procedure. So I look at this and I really want to unpack the customer’s need and saying what is the demand for real time and if you’re not making decisions within the first

21:30 Five minutes, one minute, five minutes, ten minutes, fifteen, that's an argument to say that you don't need to overspend on the freshness of the data. You can slow down a little bit, and most companies are usually comfortable with this price-balance mode of, I want things as of yesterday and older. Anything that happens today, we can have separate reporting for. So, I also feel like there's a dichotomy here. Let me just pause right there; I said some things there, Christopher. What's your reaction

22:02 To that? Is that your same sentiment there? It’s true. Like they think you you touched on two things, right? You touched on the one the difference between real time and say near real time, right? Yes. Correct. Do I really truly need this down to the second? Yes. and I think going back to your question and the question we were just talking about like unpack that right so I have a business user they come to me and they say I need my data in real time right my question is why like to to your qu your follow-up question what are you going to go do with that right yes and when you can when you when you add that question you can really start to

22:35 Peel back the onion or the cake or whatever you want like whatever layer how whatever layered tool you want to leverage to use your analogy right yeah but when you start to peel that back. You say, “Well, what is it that you want to do?” And I think that applies to every report though, not just real-time reports, right? If I’m building a report for a user, why am I building that report for a user? What is it that they’re supposed to go do with it? Right? There’s a difference between what is it that like I want to see what my sales were yesterday versus I I want to go get this data in real time because

23:07 I want to go make a a decision, right? I want to go create something in my business that happens. Right? If the answer to that question is, well, why do you need that in real time? What is it that you’re going to go do? If it’s, well, I need that information in real time because that supports X business process that does XYZ, right? Correct. Or if I want to look at, , if if a number changes from say seven to nine, right? Then that means bad stuff’s happening, right? And I need to go take some immediate action, right? Correct. When you look at the number of reports

23:39 That are out there that literally someone’s job is to watch that report all day long and look for those exceptions and those anomalies, right? Those exceptions are all things that why do we necessarily need a report for it when we can leverage AI or RTI or tool, right? And we frequently hear it called AI, right? I get pulled into a lot of conversations that are I want to do AI on my data and then we start talking and they’re like well I really want to know if if someone’s been on shift for more than eight hours right

24:11 Knowing if someone’s been on shift for more than eight hours that’s not an AI use case that’s a event driven use case that is a yes okay I clocked in at a certain time I know that I need to clock out by a certain time and if I haven’t done so then I need to go trigger an alert right I need to go trigger something it needs to become event driven for what it is. I don’t need a report. Then I need a watch because it’s not sustainable or realistic that I, as a user, I’m going to be taking every five minutes of my day going and switching back to report. If if you are, that’s an incredibly inefficient use of your day, right? No. if I can just train the system, and

24:45 Tommy, I’m going to stop in just a second because I can see you want to say something. , if I can just train the system to say, “Hey, I just want to create a rule.” And if I create that rule and I’m processing this data and I’m streaming this data in and I know that, , Chris clocked in at 9:00 a.m. and it’s coming off 5:00 PM and he hasn’t clocked out yet that I just want to go trigger some alert and just say, “Hey, Chris, you need to go clock out.” Right? Those I think where we’re really starting to see the gap between what real-time intelligence is truly capable of and what where we move past just the the streaming platform, right, of just, , hey, I’ve got a

25:17 Million sensors and I want to pull it all in. Go ahead, Tommy. Yeah. So, so you made some interesting points there and I just want to talk about the real time and that line that I see that exists. So, completely see what you’re saying, but there’s a threshold for me in terms of real-time action and then real-time reporting and especially for those who want to see their sales, right? So, there’s a few hurdles just to get that real time even if it’s on their existing dashboard, right? One of the

25:50 First thing is dealing with the team skills. It’s even though you’ve made it easier than I think it’s ever been, there is still some upskilling in terms of this streaming architecture, some of that event driven design and also being aware of the cost. So would you disagree agree or disagree that everything could be or should be real time or would is maybe a better way of putting this is fabric’s real time analytics opens up

26:23 The door for more possibilities. I think I would say first like when I look at what real time intelligence is capable of, it’s very useful for measuring events that are happening in your business, right? Doesn’t mean that it replaces everything. It doesn’t like if I want to go implement a a master data, , quality process, right? That’s not really a real-time intelligence thing. That’s not what we do, right? However, if I have a master data list or if I’ve gone through a data quality

26:56 Effort and I know that hey, this is the reference data. These are all the customers I have or these are all the products that I’m selling or these are whatever it is that I’m leveraging. All those what in the warehousing space we used to call fact tables, right? All those fact tables that are measuring something that’s measured in time granularity that says like, hey, as these things are happening, I’m loading them into my fact table. like those are all things that I can leverage to say should I make those event driven so I can act on those business rules as they’re happening right because one of the really key things of real-time

27:28 Intelligence is you don’t just have reports or a trigger whatever it is I get a whole host of things that become available to me right I can go look for anomalies I can go integrate this into AI applications I can go leverage all the different pieces that I want right when we look at a lot of the different things that are emerging and evolving in the space Right? Being able to I think proactively prepare your data in a way that it can be consumed in all these different techniques becomes very powerful. Right? Because then I’m not just constrained to say, “Oh, I can only make it available for a real-time

28:00 Report. I can only make it available for these things, right? But if I can process those events that are happening in my business, then I get a whole bunch of new opportunities that then become available to me.” This is interesting. There’s a there’s a question that came up in the chat here that I just want to unpack a little I think it relates very well with what the two points you’re talking about here are communicating which is there’s this misunderstanding I think Christopher that you were saying like what is AI versus what is real time but real time sometimes funnels into AI

28:33 Based things, like for example anomaly detection, right? I'm just going to send in data in real time, have a range of parameters or range of acceptable values of data, and I'm looking for when does that data point expand outside of that normal range. And actually, I gave a link here to the real-time intelligence anomaly detection article, just want to point that out in the chat window; that is currently in preview, so there is a preview for real-time intelligence with anomaly detection. Something's happening, data is

29:05 Moving along, and all of a sudden something occurs that's out of spec or out of tolerance. So, as one who has spent some time in manufacturing, I'd like to talk through a couple of real-world examples where real-time intelligence makes a lot of sense. One of them was when I was working as an intern way back in the day as a mechanical engineer. We were doing a lot of tolerance specifications. And what we were able to do was have digital calipers, digital instruments, that were being

29:37 Applied to our workstations. So all of the users who were building parts were checking them as the machines turned out these parts. We were making medical implants at the time, so you're always checking the tolerances on things. And since we had digital calipers and digital micrometers, all that data could be sent to a central place. So we were centralizing the data, and managers could watch the different tools in the area and watch real-time analytics on what was happening on the tooling floor, which was amazing, because at some point

30:11 The tooling wears out. The computer program follows the same path every single time, but if the operator isn't paying attention, that tool will wear down just ever so slightly and start pushing the tolerances, making diameters bigger or something smaller. In that situation it was really important, because the material we were working with was extremely hard. Titanium, it's pretty expensive. You don't want to be wasting the material or throwing away a lot of these parts, because you're spending real time to produce them. That's real dollars saved by being able to

30:45 Monitor that in real time. And so if something is going out of tolerance within a part or two, you want to quickly identify the problem on machine 3. Let's go talk to the operator, let's make sure the machine's doing what it should be doing. So that was one really good use case: not real time in Fabric, but a case for why you'd want to leverage real time. And you touched on a couple of different things, and I'll come back to the RTI and AI comment in a minute, but one of the really powerful things RTI allows us to do is exactly what you're talking about,
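The tolerance monitoring Mike describes boils down to a simple out-of-spec check over streaming measurements. A minimal sketch, with the machine names and spec limits invented for illustration:

```python
# Minimal sketch: flag measurements that drift outside a spec window,
# the way a centralized tolerance monitor on a tooling floor might.
# Machine names and spec limits are illustrative, not from the episode.

def out_of_spec(readings, lower, upper):
    """Return (machine, value) pairs whose measurement falls outside [lower, upper]."""
    return [(machine, value) for machine, value in readings
            if not (lower <= value <= upper)]

# Diameter readings (mm) streaming in from digital calipers.
readings = [
    ("machine-1", 12.50),
    ("machine-2", 12.49),
    ("machine-3", 12.56),  # tool wear pushing the diameter out of tolerance
    ("machine-1", 12.51),
]

alerts = out_of_spec(readings, lower=12.45, upper=12.55)
print(alerts)  # [('machine-3', 12.56)]
```

In a Fabric setup this rule would live in an Activator condition or a KQL query rather than application code; the point is that the check runs as each reading arrives, not once a day.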

31:17 Because we're able to act and react on things in real time, you can very easily associate a dollar value to your solution. Great point. Say this machine goes offline at 1 PM and I'm processing my data once a day: I might not know about that until tomorrow. Depending on your organization, that could be $1,000, that could be $10,000, that could be a million dollars in lost revenue, correct? But when you're able to process that data in real time, you can say, hey, it's 1:15

31:49 And that machine just went offline. Now I can say, hey, we need to dispatch somebody out there to go fix it right now, because that is costing my organization money, and that's real dollars. And that really moves the data team beyond being a report-generation shop. I used to have a manager a really long time ago who would always say, it's just a report. It was very important to him. We're not getting up in the middle of the night to fix your reports. Sorry, it's just a report, you'll be okay if we fix it at 9:00 a.m. when we come in, right? Which, I love that attitude,

32:21 Because I think it really reflects where reports fall, right? They're very reactive mechanisms. But when you can get to that proactive point, you can say, hey, this business process is fundamentally broken. It's not working right. People can't do their job, or the machine isn't producing an output, or whatever business process is happening. There's a tangible dollar amount to that, and it can be represented not as funny money but as real money, that says: as an organization, this is costing you a

32:56 Million dollars a day. So being able to fix it in 10 minutes is much more impactful than saying, oh, we'll fix it tomorrow, whenever that happens. And that's really where you start to see the data team elevate from the report writer who's just providing that, to being a partner within the business, right? To say, hey, we can leverage this data to go do all these different things for you. So there's a lot of value, I think, in being able to react. And it's interesting, because as we pull data in more real time, we find all

33:30 The things that you don't necessarily see when you look at it in aggregate. When you look at, say, RPMs spinning on a machine (and we'll have to switch our analogy here in a minute, because we're going to get down the machine road), say this machine typically averages around 3,000 RPM. Or, to your point about the titanium: if I'm only processing that once a day and I'm averaging it or looking for it

34:03 Or something, there's a lot of nuance in the granularity of the data that you lose. Right? If I have a spike, and it spikes five times in an hour to 5,000 RPM and then comes back down to 3,000, I won't see that if I only process it once a day. If I'm able to process it in real time, I can see, oh, it's spiking at 5,000. Why is it consistently spiking? And then, when you can leverage things like the anomaly detector and some of these other action systems we're building to make it really easy to consume this data in real time, you can really

34:35 Quickly and easily see, oh, here's an anomaly in what I'm seeing. Now I know that this machine is causing an issue, or that when the RPMs spike to 5,000 five times within an hour, the machine typically fails within seven days after that. So, going back to generating real business value: the closer you can get to that predictive analytics line, you can go fix that machine before it even comes offline, right? Not only are you not incurring the outage at all, but you're
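The granularity point above, that a daily average hides short-lived spikes, works out like this in a minimal sketch (the numbers are invented to match the 3,000/5,000 RPM example):

```python
# Sketch: per-minute RPM readings where brief spikes to 5,000 RPM
# vanish in an average but stand out when you keep the granularity.
# Numbers are illustrative.

readings = [3000] * 55 + [5000] * 5  # one hour: mostly 3,000 RPM, five 5,000 RPM spikes

average = sum(readings) / len(readings)
spikes = sum(1 for rpm in readings if rpm >= 5000)

print(round(average))  # 3167, which looks close to "normal" in aggregate
print(spikes)          # 5, the early-warning signal the average hides
```

The same contrast holds at any scale: a batch job that only sees the aggregate never fires the "five spikes in an hour means failure within seven days" rule.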

35:08 Saving that company money, right? Because they're able to continue their business processes uninterrupted. So, okay, Tommy. Yeah, Tommy, you go first. I have two reactions I want to talk about. This is so good. I'm going to slightly shift. So, Mike, you go, because I don't want to pivot off this point here. So, we're talking about RPMs and machines and manufacturing. I'm going to bring back an analogy everyone on this call, everyone on the podcast, will understand. Okay? And Tommy, you're going to laugh at this

35:41 One, too. I'm smiling and smirking over here. I'm like, oh yeah, this is good. If only I could get real-time analytics on my CU consumption in my Fabric capacity, because Tommy runs that report and it spikes my CU usage, and he's exporting all this data, and it's killing my capacity, and I need to know. Yes, it's RPM on a machine, but realistically, going back to real dollars, real impact, real loss-prevention-type things, those are scenarios that organizations should definitely be looking at in

36:13 This space. This is where I like the analogy of the Ring doorbell (or the Nest version, whichever, it doesn't matter): I want to be able to say these are the activities that I really care about, especially in the monitoring space. Hey, I really care about people exporting to Excel, or when they're hitting Analyze in Excel queries, what does that look like, or queries that are taking longer than a minute and a half to run against the Analysis Services model. Those are real-time events where people are trying to do things in a

36:46 System and stuff is potentially immediately going down, and I want to start adjusting or working with that individual immediately to say, okay, the behavior is wrong. You shouldn't be building this massive pivot table; you should first apply some filters or filter down the data before you use these things, because you're killing our capacity. So there are all these other experiences, and I think that makes a lot of sense in the machining world, but everyone should be relating to it in a capacity-consumption world as well. All of that can be very real-time decision-making. And if you

37:21 Even really wanted to go all in on this, you could say, well, when my capacity hits a certain level or certain threshold, you can trigger an Azure Automation runbook that says scale up to the next SKU, and then, when you recognize the spike has gone away, you can scale back down. There's a whole bunch of really good use cases, like autoscaling your Fabric SKUs up and down to help you manage your capacity. That way, when you have those high-demand

37:53 Usage points, you have the capacity available, and when you don't need it, it can turn off. Another example: you can use a Fabric workspace, turn it on, run the data engineering, hydrate your semantic model, and then literally pause the Fabric capacity, turn it off because you're no longer doing data engineering, and just let the semantic model hum along and serve the data from a different workspace. There are all these really rich use cases for tuning and cost optimizing if you have real-time intelligence or event-driven architectures that solve these
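The scale-up/scale-down idea can be sketched as a small threshold policy. The F-SKU names below are real Fabric naming, but the thresholds, the one-step policy, and the `recommend_sku` helper are assumptions for illustration; actually resizing a capacity would go through an Azure Automation runbook or the Azure APIs, which are not shown:

```python
# Illustrative threshold-based autoscaling over the Fabric F-SKU ladder.
# Only the decision logic is sketched; the actual resize call (Azure
# Automation / REST API) is deliberately left out.

SKU_LADDER = ["F2", "F4", "F8", "F16", "F32", "F64"]

def recommend_sku(current: str, utilization_pct: float,
                  high: float = 80.0, low: float = 30.0) -> str:
    """Step one SKU up when utilization is high, one down when it is low."""
    i = SKU_LADDER.index(current)
    if utilization_pct >= high and i < len(SKU_LADDER) - 1:
        return SKU_LADDER[i + 1]   # spike: scale up before workloads throttle
    if utilization_pct <= low and i > 0:
        return SKU_LADDER[i - 1]   # quiet: scale back down to save cost
    return current

print(recommend_sku("F8", 92.0))   # F16
print(recommend_sku("F16", 12.0))  # F8
print(recommend_sku("F8", 55.0))   # F8
```

A real policy would also want hysteresis (a cool-down period) so brief spikes don't cause resize thrashing.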

38:26 Kinds of problems. All right, I'm going to pause. Yeah, that's a great example, and it's a great use case for exactly what you're talking about. You certainly can, right? Leveraging Activator in real time, you can really start to get very sophisticated with some of these things. Love it. Sorry, I didn't mean to interrupt there, Tommy. I have one more example I'd love to throw down here for the airlines, but if you want to go, I don't want to take any more of your time. Honestly, Mike, with our Fabric consumption and trying to do real time, there's a whole other can of worms. Why don't you do the airline one,

38:58 Because that's a good example. It's another good example. So Christopher, you were just talking about how real time costs real dollars, and I agree with you, right? It's the decision point of when something's happening. So I was working with an airline, and I learned a bunch of things I didn't know about airlines. What happens behind the scenes? How does maintenance occur? There's a lot of preventive maintenance that happens inside the airline industry. And the airplane is kicking out gigabytes of data every day. In a single day, it's gigs of information that's

39:31 Getting pushed somewhere. Doors are open, sensors are on, all this stuff. Well, I found out about delays at the gate. So your plane is at the gate (and this is one everyone can relate to): if that plane is at your gate and it's delayed a minute, 5 minutes, or 10 minutes, every minute you're sitting there at the gate is costing them something ridiculous, like thousands of dollars a minute, just to sit there. So airlines need real-time information: has everyone checked in? Is everyone ready to go? Is everyone on board that we need

40:03 To have on board? They want to close that gate on time every single time, because every minute is costing them real money, and that plane is a large investment they need to keep running. So we were building real-time reporting around maintenance: where do these planes have to go? Can we know the location where the plane will land and what parts need to be maintained, and how can we have them delivered to that location, so the maintenance can occur at that site quickly and get that plane back into service? This was another one of these use cases where we

40:34 Were actually building, and in this example it wasn't Fabric only. We did a little bit of Fabric work, and we were using Databricks and a combination of Fabric to stitch the data together. But back to your point, Christopher: it landed in a report that was updating every minute, and we had someone checking on individual planes and individual flights, and that was the area we were really focused on. It really elevated the BI team: we're not just a cost center now, we're saving the company money. We're adding real value back to the organization.

41:05 Yeah. Not to belabor the point, but you really get to a point where you start to flip the script and say we're so much more than just this reporting arm, because you do integrate. And using your airline example: airlines are a great example of streams of data coming from all over the place, right? Not just from the flight, but also baggage, crew assignments, passengers, bookings, and all

41:37 Kinds of stuff that's constantly flowing. And when you look at how that typically operates within an organization today, or within an airline, there's a whole bunch of different things that happen on the IT side, right? There's the reporting piece from the data perspective, but when you have those types of scenarios like what you just described, maybe they're not using a report for this; maybe they're using a real-time application. Maybe I have an application, I'm pulling data, I'm trying to organize this and pull all this data from all these different places and see this

42:09 In real time. So even if you're not necessarily doing it as a data team, that doesn't mean it's not happening within the organization. It might just be happening on another team, right? Yes, or in a different system. Yeah, it's happening somewhere. Correct. Right. And so if I can pull that together and collate it, and make it very easy to start building all these things, I can wind up with that single source of truth. I was talking with an airline recently, and they were telling me one of the challenges they have today: if we pull this

42:41 Report from this system and it says there are 78 people on the flight, and then I pull a report from another system and it says there are 79 people on the flight, that means a gate agent has to go down to the airplane and count the people and say how many people are on this plane. Yes. You can't have that. If you can get to that single source of truth, then, whether it's a real-time application or a report or an Activator alert or whatever it is, I can build all these different things on top of it, and that becomes very freeing to an organization, because I can build a lot more than just a report that someone
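The head-count mismatch described here (78 versus 79 passengers) is exactly the kind of discrepancy a reconciliation check surfaces automatically. A toy sketch, with the flight numbers and system names invented:

```python
# Toy reconciliation: compare passenger counts reported by two systems
# and flag any flight where they disagree (the case that otherwise sends
# a gate agent down to count heads). Flight numbers are illustrative.

def mismatched_flights(system_a: dict, system_b: dict) -> list:
    """Flights where the two systems report different passenger counts."""
    return sorted(f for f in system_a.keys() | system_b.keys()
                  if system_a.get(f) != system_b.get(f))

checkin_counts  = {"UA101": 78, "UA202": 143}
boarding_counts = {"UA101": 79, "UA202": 143}

print(mismatched_flights(checkin_counts, boarding_counts))  # ['UA101']
```

With a single source of truth there is nothing to reconcile; until then, a check like this at least turns a surprise at the gate into an alert raised the moment the numbers diverge.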

43:15 Has to go check. I can leverage all these applications that people are consuming, and so I'm able to not just prep my data but also build my data in a way where it can easily be reused across a wide variety of use cases, where I don't have to re-pull data. Interesting. Okay, I'm going to pause there. Tommy, you got something to say? Tommy, go ahead. I think it's time to talk about the team and the organization here. Yeah, I don't want to talk about that. Oh, they should just go ahead and do it. Nah, let's move on.

43:46 The tech. I'm just teasing you. Okay. All right. That's fine. We've got to wait at least another six minutes so we can get right to the very end and be like, okay, and we're done. No, seriously: I think it's time to talk about where this actually fits when we think about the adoption roadmap. And I want to talk about this in a few areas. Obviously not trying to boil the whole ocean here, but where do you see real-time

44:20 Intelligence fitting in, in terms of lakehouse, notebooks, OneLake, Direct Lake, warehouse, etc.? We have all these features, right? And you can only do so much at one time. So first: where would you say real time sits (obviously there may be some bias here, but really we're all starting from the ground up) in terms of priority for getting to know it, understanding it, and making sure it's cost effective? And then two: when it comes to actually dedicating

44:54 Team and resources around it, when does it get integrated into your center of excellence from a self-service side of things? Yeah, that's a great question, Tommy. I think there are a couple of different things. The short answer is that real-time intelligence can really be whatever you want it to be, right? You can leverage it for an individual business use case, or you can leverage it as that central source of truth for however you want to pull it. A long time ago, before the age of data lakes, we had this

45:28 Concept called operational data stores, if you guys remember those. The idea was, before a data warehouse, I had my ODS. My ODS had a real-time view of everything that was happening in my environment, and when I had to build operational reports, I put them on top of my ODS, because then I could empower the business. I could give them the information they needed without sending them all the way back to the transactional system, and it was an easy way to alleviate the load. Right. Sure. When we got to data lakes,

46:00 We don't really have a similar concept anymore. It seems like the name of the game is: I'm going to process the data through my data lake as quickly as I possibly can. Right? Mhm. When you look at RTI and where it fits into that overall picture, we have customers leveraging RTI as an ODS: I'm just going to bring all my data here. My Eventhouse within Real-Time Intelligence becomes my central operational data store, and then I send it downstream through OneLake to a lakehouse or to whatever external tool I

46:33 Want to. Right? So it can be very freeing in how you leverage it or how you want to implement it. You can also just land the data in an Eventhouse and say, I just want to use it as my lightweight data warehouse for whatever I need. So there are a lot of different ways you can leverage it. I think where it's most impactful is when you can put it in front of the lakehouse or the warehouse or whatever it is. So, to your point about notebooks, lakehouses, and all these different things:

47:05 You still want to process that data, you still need to go through those steps, but if I can leverage RTI to unlock those operational capabilities beforehand, then we see it as the first step in the process. And that's bias coming from me and my team inside of Microsoft. So, in candidness, you might get a different answer if you go talk to somebody else, right? But when you look at what RTI can do and where it fits, putting it upstream of whatever downstream analytics process you want to do really

47:39 Starts to enable you to take advantage of all these systems while still building your lakehouse or your warehouse or whatever. And so there are a lot of opportunities to fit it into that operational construct. But when I think of RTI in my head, because I'm an old-school analytics guy, it equates very much to what we used to call ODSes, right? I have an ODS; it's just an ODS that's supercharged, AI-ready, AI-capable, and all the other things

48:11 That I want to go do. But there are a lot of other things that come with that as well, and I know those are some things we'll talk about in a couple of weeks, right? When we talk about things like data modeling. In an ODS, you didn't model any data. You just brought it in, and the way it came in is the way it came in, right? A lot of those same concepts apply: I just want to bring data into my ODS, or my RTI capabilities, and once I've got it available for all these action systems, I can send it downstream wherever I want. Under the newer context,

48:44 We'd call it the Lambda architecture, right? We have the hot path and the cold path, these two different things. I like this. I was just going to make a point about this: reporting should be boiled down to, here are the things I care about in the context of today (the real-timey things), and here are the things from yesterday and older, where the data is not changing. It's just static. So you can spend a little more time processing it to get it structured for optimized reporting, as opposed to today's data, which you just need to see right now. It's got to get to a dashboard ASAP, right? And so we've seen Lambda architecture really evolve over

49:18 The last several years to be: I have my hot path and my cold path. But when you really look at real implementations at real organizations, it's normally Lambda cold-path architecture, and everything just flows all the way through, right? And the hot path is like, well, we'll try to do the processing of the data faster. That's going to be our hot-path layer. That's not really a Lambda hot path. A Lambda hot path is much more about that RTI ODS perspective: I want to process it, and then I'm going to drop it to something and move it. I think Fabric is the first tool on the market

49:49 That I've seen where you can very clearly delineate and say, this is my hot path and this is my cold path. Now, does that mean I can only leverage RTI for Lambda-type stuff? No, it doesn't, because I can leverage it for other things too. We have customers who use RTI as just the Kappa architecture. They're like, I don't need to mess with Lambda, I'm just going to do everything in Kappa. And so it really comes entirely down to you as an organization and as a data team: how do you want to leverage RTI? The name of the game is, I want to act on things as they're occurring, and I want
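The hot-path/cold-path split being discussed can be sketched as a tiny router: every event lands in the cold path for later batch processing, while a hot-path rule reacts immediately. The event shape and threshold below are invented for illustration:

```python
# Sketch of a Lambda-style split: each event is kept in a cold-path
# buffer (for batch jobs later) and also runs through a hot-path rule
# (immediate alerting). Event shape and threshold are illustrative.

cold_path: list = []   # everything, processed in batch later
alerts: list = []      # hot-path output, acted on immediately

def ingest(event: dict, rpm_limit: int = 4500) -> None:
    cold_path.append(event)        # cold path: keep the full history
    if event["rpm"] >= rpm_limit:  # hot path: react right now
        alerts.append(f"machine {event['machine']} spiked to {event['rpm']} RPM")

for e in [{"machine": 3, "rpm": 3000}, {"machine": 3, "rpm": 5000}]:
    ingest(e)

print(len(cold_path))  # 2
print(alerts)          # ['machine 3 spiked to 5000 RPM']
```

In Fabric terms, the hot path maps to an Eventhouse plus Activator, and the cold path to whatever lakehouse or warehouse the data flows into downstream.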

50:21 To go from being a cost center to a revenue generator, right? And I think that's really the mental switch to master what real-time intelligence is truly capable of, because when you can really grasp that concept, all these other things become available to you. I think we're going to have to have another episode on the user-based piece. So again, Tommy, that's a big question you asked, around users and where this fits and who needs to know what. What I would argue right now, from my experience so far:

50:53 People who are stepping into real time have either had some experience in other tools around real time, and they're now happy that they've come to Fabric and it's available to them, right? Hey, we know we can do real time, we have some vague understanding of how it would work in our ODS or an existing system. I'm finding quick wins with those customers quickly moving into Fabric and just starting there. I think, Tommy, your question is really more about those other users, the ones who are potentially going to

51:26 Start experimenting for the first time, or starting to build their first architectures. What's the learning path, or how do we get them up to speed? I think it's possible, but I would definitely cater this real-time analytics to more of the data engineering role. That is the persona I think fits very well with RTI. And I think it'll get better; there will be more RTI for PowerBI, maybe data modelers and report builders as well. But for now, me personally, I would put it in that data engineering space. And

51:59 As things get better, it'll probably be more accessible to more individuals across the organization. Christopher, would you agree with that, or am I making a very weird, out-of-the-norm bold statement there? I don't know that you have to fit into a data engineering construct to really make the key uses of RTI. Any analyst who knows they want to get their data faster and wants to process it, that's the goal, right? So I'd be surprised if that was the

52:32 Primary persona of who our primary developers are, right? Talking to customers, from what we've seen, sometimes it's not even a data team at all. We have customers where the analytics team, or an analyst, discovered it and is processing it, and they built everything on their own, without any capabilities from the central data team. We have other customers where a non-central data team built it and then the central data team took it over. And we have

53:03 Other central data teams, too, that just build from the ground up. We talk a lot in the real-time intelligence space about where real-time intelligence fits into the data mesh type of approach, right? So when we think of data mesh and Fabric and all the different things they're capable of, real-time intelligence fits very nicely into how you scale those nodes out to empower all those business teams. So I think it can be built by data engineers, but the goal is for it to be the

53:38 Microsoft Ratatouille effect, right? The PowerBI effect of old: that anyone-can-cook approach. True. In many ways, on the CAT team, we draw a parallel between real-time intelligence and what PowerBI was 10 years ago. When PowerBI came out 10 years ago, it was a very niche product. There weren't a whole lot of people. I can tell you, to this day, I remember my boss at the time calling me: hey, there's this thing. You're the Analysis Services expert, you know SSAS like the back of your hand. It's called PowerBI. Customers

54:12 Are starting to ask about it. You should go pick it up and learn it. And so I focused on PowerBI for a long time, on all the different pieces, and then we look at where that's going. We think RTI really fits in that same niche: there's so much capability in what you can go do with it. It really is like PowerBI next, right? Of all the things you can do. So if only data engineers can use it, then I would say we need to make some changes, because we're not doing something right. Then next podcast episode I'll have to be

54:44 Like, RTI for everyone: change my mind. I love it. Awesome. Let's do a wrap here. Tommy, any final comments or final thoughts around RTI and our initial discussion unpacking what real-time intelligence looks like? Honestly, what I'm looking forward to is actually having you on, Chris, and talking about more than just what is real time: how do we actually deal with the data modeling for the PowerBI pro, how do I actually migrate over, and talking

55:18 About cost and, ultimately, adoption. Awesome. I like this conversation a lot. I think this is a really good discussion point. I also think there's a really strong story we need to convey here, which is: we're talking about real-time analytics, and a lot of the examples I gave earlier were, money's involved, I've got to make decisions quickly. But I think we also need to unpack and be very mindful of event-driven data loading. Event-driven architectures are also very important here, and that's another really big story in what RTI is doing. And I

55:50 Think that actually speeds up a lot of your data processing and timelines, for medium all the way up to very large organizations. There's just a lot of coordination between different teams and different systems, and event-driven things assist with that. So I also want to very much highlight that I think RTI is for every organization. Everyone should be looking at it in various forms for what they need, either processing data and making decisions quickly, or doing a lot of the event-driven things. Christopher, thank you so much for your time. We really appreciate you joining us. I know you could be doing a lot of

56:21 Other things, like really helping customers out, being proactive, and making sure everyone's happy on your side. We do appreciate you spending some time with us as we unpack what real-time analytics looks like on the podcast. That being said, we thank you so much for joining us, everyone. Thank you so much for being on the podcast and listening today. We know you could spend your time anywhere else. If you want to have these episodes without any advertisements, we do have a membership on the YouTube channel. So please go ahead and join the membership; we'd love for you to join us over there. That helps support what we're doing here, the activities,

56:54 And helping us pay for the apps and the things we use to keep the podcast going. That being said, Tommy, where else can you find the podcast? You can find us on Apple, Spotify, wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. Share with a friend, since we do this for free. Do you have a question, idea, or a topic you want us to talk about, maybe more real time? Head over to powerbi.tips/empodcast, leave your name and a great question. And finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, and

57:28 Join the conversation on all of the PowerBI.tips social media channels. Chris, thank you so much. Appreciate your time today, and we'll see you around next time. Thanks, everyone. Thank you.

Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
