PowerBI.tips

Fabric Real-Time Analytics – Ep. 274

Fabric Real-Time Analytics looks slick in a demo: flip it on, stream events, watch visuals move. In this episode, Mike, Tommy, and Seth break down what’s actually happening under the hood—and what you should validate (permissions, storage, and cost) before you call it production-ready.

News & Announcements

  • Build Wireframes with AI — PowerBI.tips wireframing now supports AI-assisted layout detection: drop a background “scrim,” let the tool find the bounding boxes, and auto-place visuals so you spend less time on alignment and more time on storytelling.

  • Fabric Change the Game: Real-Time Analytics — A walkthrough of Fabric’s streaming stack (eventstream, KQL database, and related patterns), including an end-to-end example of ingesting real-time events into a store you can query and act on.

Main Discussion

Topic: Fabric Real-Time Analytics (eventstream, KQL, and “real-time” tradeoffs)

Real-time is rarely “just faster refresh.” The conversation focuses on how Fabric is stitching together streaming ingest, query, storage, and reactions (alerts/activations)—and where the operational risks show up.

  • Treat it as a pipeline, not a visual: source → eventstream → KQL / lakehouse → reporting + actions, and be explicit about which pieces you actually need (see the ingestion sketch after this list).
  • Decide what “real time” means for you: a push experience (live visual changes) is different from a poll/query experience (latest stored records), and users feel that difference.
  • Don’t lose the signal: if you care about analysis later, land streaming data to durable storage (e.g., Delta) so the momentary “blip” becomes history you can trust.
  • KQL in Fabric can be powerful for high-volume event analysis, but it introduces another language and data-shaping layer—plan how it connects to your semantic model and downstream consumers.
  • Governance still matters: workspace-level access can make it hard to isolate assets cleanly, which pushes teams toward workspace sprawl as a workaround for missing granularity.
  • Streaming can be a silent cost driver: “easy to turn on” plus “forgot to turn it off” is how you wake up to weeks of compute burn from a harmless test.
  • Data Activator / Reflex may be the broader win: reacting to meaningful changes (a KPI crossing a threshold, an event pattern emerging) can deliver value even when full streaming isn’t justified.
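
To make the front of that pipeline concrete, here is a minimal, hedged sketch of pushing events into a Fabric eventstream through its Event Hub–compatible custom endpoint. The connection string and stream name are placeholders you would copy from the eventstream’s own settings page:

```python
import json
import time

from azure.eventhub import EventHubProducerClient, EventData

# Placeholder: copy the real connection string from your eventstream's
# custom endpoint (it is Event Hub-compatible).
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."
STREAM_NAME = "<eventstream-name>"

producer = EventHubProducerClient.from_connection_string(
    CONN_STR, eventhub_name=STREAM_NAME
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({
        "deviceId": "sensor-42",
        "temperature": 21.7,
        "ts": time.time(),
    })))
    producer.send_batch(batch)  # events now flow: eventstream -> KQL / lakehouse
```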

Looking Forward

Pick one operational use case (manufacturing, IoT, near-real-time monitoring, or alerting) and pilot it with clear cost + permission guardrails before you scale Real-Time Analytics across your Fabric tenant.

Episode Transcript

0:30 Good morning everyone, welcome back to the Explicit Measures podcast with Tommy, Seth, and Mike. Hello everybody, welcome back. It’s Thursday. Jumping into our main topic for today: what does Real-Time Analytics look like in Fabric? Do we think this is going to be a thing? Will this actually be something people want to use? We’ll get into all of that, but before we do, let’s hit some news articles.

1:00 Today we have an amazing announcement from the tips and theme-generator space. We’ve been working quite hard here at the tips warehouse, trying to build tools that help you build reports faster and easier. In the last couple of months we released wireframing, where you can build your own pages and add your visuals to them.

1:30 Well, today we now have AI-generated wireframes for you. It’s crazy, and it really helps your workflow by automating a lot of steps. The idea of this feature is that you create your background image, which we like to call a scrim. You create your various scrims, or pages with images on them, and if those pages have bounding boxes where you think the visuals will go, you can now run our AI on top of them.

2:01 It scans your image, finds the boundaries of your squares, and automatically places all the visuals on the page for you. You can then change the visual type; we do some guessing. We’ve also added an automatic layout feature, with predesigned orientations of where we think visuals should go, so you can click a layout and immediately add your visuals. I think it’s going to be a huge timesaver: you can focus less on getting every pixel correct on the page.

2:33 You can just focus on getting the image right for the page, which should speed up your workflow quite a lot. Seth and Tommy have both been playing with it. Yeah, it does, man. If you’ve spent any time building your own background images, there’s still the time-consuming aspect of creating a visual, even within our tool: sizing it appropriately, alignment, spacing, all of which are

3:05 very important, especially if you’re spending a bunch of time on the look and feel. So this is exciting: it takes that very tedious process and, bam, one button generates the visuals. And the cool thing is, regardless of whether or not it selects the right visual for the space, it’s so easy to change the type and the dimensions. Check it out; you guys have to go check it out.

3:36 Super cool. And I love the fact that you don’t even have to have a background image. We’re applying it in multiple ways in the tool: you can have just a regular color, and there’s still the option of using wireframes we’ve generated, with spacing and whatnot, as a starter. Click the button, boom. I’m super excited about it because it dramatically speeds up the process of getting to the point where you apply a theme or whatever.

4:06 It is out in the wild. As usual we’re a little behind, so documentation and videos on how it all works will be coming out shortly, but we’re super excited about getting it out the door. Any impressions from your side, Seth? You’re dead on, because 80% of what we do is the data modeling and the data cleaning, and then, oh, now I’ve got to do the layout side and figure out what I want to fit in.

4:37 I’m amazed it’s a click of a button. I think my favorite part is that I don’t need an F64 license or SKU to use it either. Right, you don’t have to pay more money to get all the features. We do give you a trial: you can try it on two pages to see if it works with some of your scrims. If you want a report larger than two pages, you can buy the Tips Plus subscription and get unlimited pages in your reports. The feature is fully functional and fully available on a two-page report.

5:08 That way you can see how it works with your background images and your stuff; we want to make sure you can test it out. But usually you want a report with more than one or two pages, so we recommend the Tips Plus subscription. Your backgrounds and images all get saved to your profile so you can come back and use them later. I think it’s going to be an amazing performance improvement any time you’re building a report. We’re trying to make it fast and easy to build stuff, and we hope you like it too.

5:39 All right, with that, let’s transition over to another topic. Here’s the article, and I’ll put it in the chat window as well. This is an article from Microsoft, and it’s the first time I’ve seen them really publish guidance like this; there may have been other things, but I think this is the first time they’re really announcing it. They’re talking heavily about how you do the medallion architecture within the Microsoft Fabric lakehouse pattern, and there’s a little diagram in the article, which came out

6:09 today. I have to give Kim Manis credit for this one; I can never find things on Microsoft Learn, so when new articles appear I have to go find where they came from, and Kim shared it. Thank you, Kim. It talks a lot about how OneLake works inside Fabric, where you put your different lakehouses, and where you build your bronze, silver, and gold. I love this architecture; I’ve been building it for years, and I think it’s the right way to go.

6:39 The more I look at what Microsoft is doing here, the more closely they’re aligning to Delta tables, which is amazing; it’s making my life so much easier. But there’s one little sentence I caught about halfway down the article, in the guidance around security and control. You have a bronze lakehouse as a separate lakehouse, and silver, and gold, so there are three separate lakehouses

7:09 you’re building, one for each type of data. They recommend that if you need more security controls around those different lakehouses, you should build additional workspaces. So if I think this through: I could have an entire workspace just for bronze, another entire workspace for silver, and a final workspace for all things gold, and then potentially additional workspaces for datasets and/or reports. Previously we

7:39 would have two workspaces, one for datasets and one for reports, because we could separate the two. Johnny in the chat just threw down the puke emoji. We are now at a place where, for every environment you need, you could potentially have five workspaces. Five? Are you sure about that? One each for bronze, silver, and gold, then one for datasets and one for

8:11 reports. Okay, and that’s just the starting point. If you listen to their guidance: if I don’t want the development team to have access to the test workspace, because you may not want your developers playing around in test, and you don’t want your developers building things in prod, then to remove control from the people building things in the dev space you’ve got to physically separate them by workspaces. So isn’t the other

8:41 recommendation that you just build a different workspace for dev, test, and prod? That’s what I’m getting at: five workspaces times three, or more. Fifteen to twenty workspaces, for dev/test/prod or dev/QA/UAT/prod environments. And that’s if you have all your engineers across the entire development pipeline, which you won’t; you’re probably going to have them split up.
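
For a sense of how quickly that multiplies, here is a tiny illustration of the combinatorics being described (the workspace names are hypothetical):

```python
from itertools import product

# Workspace-per-layer guidance, as discussed: each environment gets one
# workspace per medallion layer plus datasets and reports.
environments = ["dev", "test", "prod"]
layers = ["bronze", "silver", "gold", "datasets", "reports"]

workspaces = [f"{env}-{layer}" for env, layer in product(environments, layers)]
print(len(workspaces))  # 15 workspaces before you even split up teams
print(workspaces[:5])   # ['dev-bronze', 'dev-silver', 'dev-gold', 'dev-datasets', 'dev-reports']
```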

9:11 This goes back to... sorry, no: where do domains fit in this? This is a perfect place where domains would maybe help, but they don’t; a domain sits at a higher level over the workspace. Everyone in the chat agrees it’s unmanageable. Johnny, you’re spot on; he’s got dev, test, prod, raw, base, curated, datasets... The good thing is we have Git integration, let’s

9:42 at least say that, because now we can automate things to some degree. But this is really going to push Purview,

9:50 that’s for sure; that’s the only way to manage this. So this goes back to our earlier conversation, where we talked at length about workspaces, and one of the things that blows my mind is that this is a permissions problem. Why do I have to build more infrastructure for a permission? If I have an environment or workspace, give me more granular permissions that let me use security to say: yep, this is what they have access to, and this is what they don’t. Unless, Mike, to your point, I can only create one lakehouse in a

10:21 workspace? No, you can create as many as you want, but I think the idea is that you can’t tell people not to touch one of them; you can’t quarantine them. The control surface area is the workspace: the workspace is who actually gets access, who the creators are. And it hasn’t changed; this is the same mantra Microsoft has been using for all workspaces, and that’s the

10:51 problem with this. Even if you do add what, in my opinion, you should have right now, which is security at these levels: if you start recommending that people go down this route, how do you unwind from it once you do add that security? To me this is extremely frustrating, because these aren’t folders. You’re creating a workspace; it’s an ecosystem, and you’re making

11:21 that the boundary. It’s the same argument we had around what workspaces were in Office, like Office 365 groups: all the stuff they spun up in the environment, with O365 admins losing their minds every time somebody in Power BI created “HR test” and boom, SharePoint sites, the whole lot. This is the same thing.

11:51 Yes, and I think you’re right. Back to the conversation we had about roles in workspaces: everything in a lakehouse is available if you have permissions on the workspace, which still blows my mind. If we’re elevating the lakehouse and, in a sense, de-escalating the semantic model, why does the semantic model have so much control around who can build, edit, and read, while we have nothing like that on the lakehouse? Well, yes and no; there is a

12:22 little bit of control at the lakehouse level. I can give you access to a lakehouse without giving you access to the workspace, so that does exist, and it’s similar to semantic models, where you can grant build access to a dataset that isn’t in your workspace. So at some level I do agree with that piece. That’s about as similar as us being on Sirius FM: that’s how much it relates; that is as similar as us actually being a radio show.

12:52 There’s some crossover, but come on. I get the argument that you want a separate workspace for dev, test, and prod, fine; that delineates large swaths of things into buckets. This is just not one of those things. I do like Johnny Winter’s comment: “you get a workspace, and you get a workspace.” Oh my God, there’s a meme incoming; someone rip it out, let’s get the meme going. I will support it and promote it, so someone go build the Oprah

13:23 meme about everyone getting workspaces. Regardless, I’m just happy that Microsoft is now providing some recommendations around how they see things getting built, because it’s been quiet from their front. They just shipped all the artifacts and said, here’s all the stuff you can build, here you go, and that’s it. We’re going to have to figure out how this gets managed. One thing I’ll also note: if we think

13:53 about lakehouses, a lakehouse should be a larger design effort anyway, which is so contrary to “I’m the business unit and I’m just making reports and semantic models.” And this is where I’m going to lean on what Tommy recommends: that’s more of a central-team action. Yes, you can have business units doing their own thing, but I would argue that a business unit will probably put its lakehouses in the same workspace as its datasets,

14:24 its semantic models, and its reports; they’ll just build one big pile of everything. And that probably makes sense for the business unit, because it isolates them to only the things they can touch, and they can control their access that way. When you’re talking central BI, large dimensions across the entire organization, and where all these enterprise tools come from, that’s where you need to start talking about the larger control surface area.

14:54 For that central BI team, you’re going to need better planning to figure out who really needs access to these different artifacts and how things will work across all those elements. The last thing I’ll say about this: I still think the problem, and I think it’s Microsoft’s problem too, is that the best practice for the lakehouse is ill-defined. How much data goes into a single lakehouse? We’ve talked about this being one per semantic model, or one for the entire BI team.

15:25 One single lakehouse? That’s still ill-defined. And where does the data warehouse fit, if we’re actually going to play with that? Looking at the documentation, there are two ways to read it, but to me the question is: how big is a single lakehouse? That is not defined in Fabric right now. Typically Microsoft doesn’t do that, and typically I’m fine with it, because there are enough capabilities around the

15:55 tools that you can set things up however they work best for you. This isn’t that scenario, though. I’d love to be wrong here, so if someone from Microsoft sees this, shoot me the roadmap; let me see what all this is going to become. Because right now you’re giving me only one path to go, and we’re trying to figure things out as we go with no other options. The way I want to do things, the way I

16:26 think I should be able to do things, I can’t. So I agree with you, Tommy. At bare minimum, if there are limitations, at least describe the best scenario so I can put myself in a holding pattern. But blowing out an environment to 15 or 20 workspaces, which you would have to do in certain cases for an enterprise solution, how do I unwind from that if I get something better in the

16:57 future? I don’t know; that’s the frustrating part for me. So at any rate: on one hand I’m happy Microsoft is providing some guidance around what they think is the right way to go about this. On the other, I still feel very firmly that lots of patterns are going to evolve from this. Think through what those patterns look like for your company. If nothing else, it’s going to require more planning, so think about and

17:28 plan what you’re going to try to do, do the best you can to plan it out, and lean on others who are building similar things. Hopefully we’ll talk enough on the podcast about how these planning things work for us, so stay tuned; we’ll probably have more conversation about what patterns we’re finding and what does and doesn’t work. Maybe the planning is more about what won’t work than what will, because it’s wide open right now; you can do anything you want. I’m glad we’re all reverse engineers at heart, because this is a method

18:00 without the solution. What is the end goal? That’s where I think all of us are getting stuck: okay, what are we building here, what’s the final solution? All right, I’m done. Yeah, this one’s going to get sticky, especially when you throw Databricks into the conversation, and Unity Catalog, and some of the things Adrien just brought up in the chat.

18:31 You’re talking about an IT team wanting to replicate and use security groups that are well defined in an Azure cloud ecosystem, and you’re saying they don’t work here; it just won’t work against the same enterprise ETL workflows or storage systems we currently have access to. What do you mean by that statement? I’m not sure I followed, because you can still use the same Azure security. Sure, but if I’m already separating things out into permission levels, because I want to separate bronze, silver, and gold into different layers of

19:02 access, that’s an access thing I can already do from the get-go; I don’t have to manage it in other tools to level-set permissions. You mean getting very granular, like Databricks with Unity Catalog? Oh, okay, now I understand your point. With Databricks and Unity Catalog, Databricks basically has god mode on top of all the data; it can read and write everything, and then Unity Catalog says: I know

19:32 who you are as a user; you can’t touch that column, you can’t touch that row of data. I think that’s what we haven’t seen yet,

19:39 and security-wise I’m fine with that. But then making recommendations like this: why would you recommend it, if I can’t see how I’d redo it later on? Anyway, oh interesting, this could probably be a whole episode by itself. I opened a can of worms, sorry. With that, let’s continue and get into our main topic today.

20:09 Let’s talk about our main article, the one about Fabric changing the game with real-time analytics. This is the article we’re going to go through; I’ll pull it up and share it via the chat window. Tommy, give us an intro: a quick summary of where Microsoft is going with this and what they’re trying to do. So this is a lot of the services that used to live in Azure, the Internet of Things and event-streaming side, being brought into Fabric, very similar to a lot of the other tools. It’s really become an end-to-end real-time solution,

20:40 or it’s getting close to that. Not only can I do real-time streaming from my own custom app, Event Hub, or eventstream, I can also push that into any visual in Power BI, and utilize KQL and, what is it, Kusto? Yeah, the Fabric KQL database; that’s their real-time, high-volume database. That’s one area I haven’t

21:12 played with much, and I’d be curious who in the chat has played with KQL or Kusto, and what your use cases are; you know more KQL than I do already. You’ve been playing with it? Yeah, I played around with it a few years ago at least, trying to do statements, and it’s like: why can’t you just do SQL? Why is it a completely different language? I have enough languages already.
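
As an illustration of that language gap, here is a hedged sketch of the same aggregation in both dialects, queried from Python with the azure-kusto-data SDK. The cluster URI, database, table, and column names are hypothetical; a Fabric KQL database exposes its query URI on its details page:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Hypothetical query URI; copy the real one from your KQL database.
cluster_uri = "https://trd-example.z0.kusto.fabric.microsoft.com"
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster_uri)
client = KustoClient(kcsb)

# KQL: count events per device over the last 15 minutes...
kql = """
Events
| where Timestamp > ago(15m)
| summarize EventCount = count() by DeviceId
| order by EventCount desc
"""
# ...and roughly the same idea in SQL, for comparison:
#   SELECT DeviceId, COUNT(*) AS EventCount
#   FROM Events
#   WHERE Timestamp > DATEADD(minute, -15, GETUTCDATE())
#   GROUP BY DeviceId ORDER BY EventCount DESC;

response = client.execute("EventDb", kql)
for row in response.primary_results[0]:
    print(row["DeviceId"], row["EventCount"])
```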

21:42 But the fact is, in Fabric I have this whole end-to-end solution: I can push data in, push it into different areas, build my own custom application, and push it into a KQL database, all within Fabric. I don’t have to leave the Fabric interface at all, which is pretty neat. And again, there’s that integration with any Power BI report: the report changes as data pushes through the event stream. Awesome. Seth, some initial thoughts on real-time analytics: what’s your perception of it? Do you see

22:13 use cases for this today? Do you have a lot of business users asking for real-time data? I have a question for you first. Oh yeah, sure. Based on the title of this, Mike, is this a game changer? I think that’s a fair question. I don’t think this is going to change the game. It doesn’t qualify for you as a game changer? No. AI for

22:43 wireframing, that’s a game changer; that’s going to change whether I enjoy working on reports or not. And I agree with that. So I’ll say this: I like having things as fresh as possible in the data space, but I don’t like spending a lot of money to make that happen, so I’m torn a little bit here.

23:13 On one hand, from a design standpoint, if I can always load just the last 30 minutes of data, I’m loading very small pieces of data, so those tiny jobs run quickly and I can continually keep things up to date. Cool. On the other hand, that means every 30 minutes I’m turning some compute on; I’m using something. And it feels like, and this is just my feeling, Microsoft has basically taken a P SKU and said: hey, we’re going to give you an F SKU, we’re going to change

23:43 everything into this thing called capacity units, and we’re going to give you a thousand more backend systems that consume more compute capacity. So it’s not just Power BI dataflows consuming backend compute; we’re also going to give you a lakehouse, a warehouse, Kusto, and Spark, and all of these consume more compute capacity to run inside your Power BI

24:13 ecosystem. All real time is doing, and this is a line I use with customers, is that real time just means you’re spending more money to make the data real time. To me it’s a curve: more real time, more money; they go together. The faster you’re pushing data into a visual interface, especially using Power BI and reporting, the more you pay. So my gut tells me: okay, not a game changer,

24:44 but very helpful. I would say this is probably a nice improvement, because there are use cases for real time, but I have found real-time use cases to be very few and far between. Many companies need to make decisions every day; very few departments or use cases need a decision based on data from the last one, two, or three minutes. It gets more relevant when

25:14 you’re talking about twice a day, or once every hour; those use cases are probably more common. So is it a game changer? No. But I do think it’s very helpful to have all these tools bolted together, and I do like the ease with which you can turn this on. It is very simple to go from nothing to real-time stuff with a couple of clicks, which was way harder to set up in the infrastructure world.

25:44 Yeah. To answer your question about where I’ve seen this, or what the use cases are: enterprise scenarios. The one I’m familiar with is organizations wanting to plug this into manufacturing lines, things being made in real time, because they want alerts; they want to understand if something’s going off the rails, or a product needs to be refilled, etc. There are a lot of real-time use cases in those places. Choosing

26:17 Power BI as that solution, though, I haven’t seen a lot of, and the reason is that even in some of the POCs or half-implementations I’ve been part of, the trick it always runs into is limitations in the solution: you hit a threshold where the technology isn’t going to support everything the customer

26:48 wants. When it’s real time, what I’ve seen is that the solutions are good at just spitting data through, automatically; great, the customer’s happy. But then the customer wants to see this, or wants to aggregate, or include other metadata that makes it more meaningful, or add some calculation. The minute you start adding in the layers we’re familiar with, transforming data or merging with other sources, is where it typically starts to not work,

27:20 or you run into barriers: you can’t create this type of calculation, or you can’t show that visualization in that way. That’s where it usually falls off, and people go: okay, we’re going to build a custom application, or use a tool we can fully customize, because realistically the largest benefit for them is just getting that streaming data, and Microsoft does a great job of that. So half of this I

27:51 think is still very relevant, from the standpoint that you have Kusto and KQL and access to these events, and I do think the idea of streaming some of this data to an output, whatever that may be, is valuable. But it depends on the importance of the thing you’re pulling from all of these events, because ultimately all of the underpinning events happening in the

28:21 cloud are how you access this, whether through KQL straight up, or Logic Apps, or whatever the case may be: you want to monitor something that’s a huge cost driver, or something so important that an alert needs to fire right away if a certain usage or threshold starts happening. So there are a lot of relevant use cases, I think, for monitoring real-time things, mostly in triage areas, or IT, or, like I

28:54 said, manufacturing, things happening in real time. But in terms of using it end to end in Power BI, those are some of the limitations I’ve seen. Well, it’s interesting you mention the calculations, because I think that’s where this solution really shines: if you already have the streaming data, using Data Activator and KQL, I can do those calculations. But I’m definitely conflicted, because you’re describing two sides of the coin that I’m struggling with.

29:25 Where it’s easy to turn

29:28 on, and... there are heads? Heads, sure, okay. Maybe next time, Tommy; it was a good analogy, sorry. It really is easy to turn on, but if you’re trying to get to any specific solution, you’re still back at that custom application; there’s still a large learning curve. This is not just “spin up a lakehouse and use dataflows,” or a lakehouse and some

29:59 Python; there are still a lot of other steps. Do I also have to integrate eventstream if I already have one? That, to me, is one of the big limitations right now. If I already have the data, then this is a phenomenal solution, yes. And yet I had to turn mine off when I was testing, because in one workspace, 70% of my compute, for really no streaming beyond the sample data, was just this one Activator product, this one

30:29 solution. Yes, and that’s a problem if I don’t need it. I want to emphasize this point: if there is a use case, like IT or manufacturing, okay, maybe you eat the cost. But if you don’t have the necessity, you’re going to go: why is this on? It certainly is a cost driver in some cases. The other challenge, in a lot of the reporting I’ve done, is that you’re

31:00 streaming, but then also streaming into storage, which I think this article outlines too: you can push this into a lakehouse. The reason you want to do that is to see the historical picture; you want to start tracking, and that’s where you’ve got to have that streaming data land somewhere. Otherwise it’s just vapor going across the visualization: “yay... but oh no, show me that blip.” Well, we didn’t store that blip.
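
A minimal sketch of that “land the stream somewhere durable” step, assuming a Fabric/Spark notebook where `spark` and Delta are available. The rate source here just stands in for a real eventstream feed, and the table and checkpoint paths are hypothetical:

```python
# Toy streaming source: ~10 synthetic rows per second. In a real pipeline
# this would be your eventstream / Event Hub reader instead.
events = (
    spark.readStream.format("rate")
    .option("rowsPerSecond", 10)
    .load()
)

# Append every micro-batch to a Delta table so the momentary "blip"
# becomes queryable history (the checkpoint makes the stream restartable).
query = (
    events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "Files/checkpoints/events")
    .toTable("raw_events")
)
```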

31:31 Exactly, and that’s the nice thing with KQL and Kusto: it has that backend. But if you want to pull that into other analytics areas, it typically doesn’t have all the other metadata you need to plug onto it, or available in the system. So it can’t just be a Kusto query; it has to be synced with something else that’s more meaningful to the business. In my experience this has always been a two-part solution: you’re always pushing to another storage location where you can

32:03 do traditional analytics and produce those in the same report, or in an ancillary report: we’ve got an alert, we’ve got to go take action on something, but then there’s the retro part of it, things over time: how many errors have happened in the last week, month, or year? That’s where you do have to store that stuff. I like where you’re going with this, Seth, and I think there are actually two reporting needs. When you think about building the report, there’s the audience, who the user is, and then we try to

32:34 link their action to the visuals on the page. You make a great point, because if I’m designing reports around real-time data, there’s another dimension to the report: do I need to make decisions on it right now, or am I looking at the trending of that pattern over time? And I don’t think you want to mix those two reports together. You want one very deliberately designed report that says: this is real-time data, that’s what I’m focusing

33:06 on. On the flip side, you should have another report, connected to a different dataset or something else, describing the longer-term trends. Because the shape of the data you get from the streaming system isn’t always the right shape for longer-term reporting. Depending on how your data sources work, the stream coming in might be creating new dimension members that you can’t handle inside that longer-term reporting, because you have to

33:36 dynamically adjust dimension tables and star schemas. So there’s additional processing that may need to happen between the real-time data and the long-term data. You would then blend those two reports inside an app: here’s the real-time report of what’s happening right now, and here’s the trending report that’s more star-schema and designed for change over time. Those are two distinct use cases.

34:08 By separating those into different reports, I think you can still achieve the same goal around reporting for that real-time data, but now you can move toward two distinct design patterns. Does that make sense? It does, it does. And the other part of this, where my mind goes, is: even in real-time scenarios, do you really need real time, or is 10 minutes okay, or 15? Because, true,

34:38 I could give you the full suite of everything we’re capable of, but if all we’re trimming is a thousand records every 15 minutes, just incrementally updating all the time, or dumping into a repository where we’re doing DirectQuery or Direct Lake or something like that, those solutions are different. How fast do you actually need it? Is that monitor really always on, with somebody just looking for the red spike every once in

35:08 a while, or do they just cycle through it? Well, there are really two products here. The side you’re describing, the “heads” side of this, if we keep using that analogy, is the incredibly granular, specific use cases where real time is not just a want but a necessity for the business. You mentioned healthcare, or hospitals, where they’re going to need that real-time data; it’s much past the point of want, it’s

35:39 how the business operates. That is so small in terms of the industries, the companies, and the normal use cases we’ll deal with; we’re probably not going to touch those scenarios. But for those industries it is an absolute need, and unfortunately you either need to build your own custom application within Activator or real-time analytics, or you’re already using eventstream or Event Hubs.

36:09 The other side of this, and I don’t know how much we can get into it, is Data Activator with Power BI visuals themselves, and that’s where maybe it becomes a little more broad in terms of use cases. Everything we’re talking about that comes from the Internet of Things or an external source is set up because that company relies on real-time data; we’re just not going to see a lot of those, or we’ll see a few use cases, but it’s going to be just those industries.

36:41 So, for those of our fans who like to see the intensity of Seth come out, I have a question. Let’s see some intensity, Seth. Well, this episode apparently is rankling me a bit. The other part I notice in this Fabric implementation and walkthrough for live real-time analytics is:

37:12 if we’re going to take one of the best ways we can visualize data in the Microsoft ecosystem and make it part of Fabric, why does this not end with Power BI? That’s a great question: why doesn’t it just land in a report? Who is going to take real-time data and visualize it this way? You’re not

37:42 going to do all this so that somebody can log into a lakehouse, look at a notebook, click “execute KQL query,” and stare at their real-time data. This is a pattern we’ve seen, and it rankles me a bit. You’ve got the full Fabric ecosystem here, all the way from the guts: we’re going to create a solution, and it ends with “hit run.” Where’s my tool, where’s

38:15 my tool? The tool we all know and love, the one whose complications we’ve been talking about. So here’s my question to you guys: this is a demo of real-time data, yep. If they’re streaming into this lakehouse, aren’t we just pumping that into a Delta table? Can’t we connect directly to that, read it with Direct Lake, and have it auto-build a report, even if it’s a couple of clicks

38:45 at the end? The demo just goes to KQL. But isn’t KQL accessible? Yes, but you have to create another integration to push it. No, I’m saying direct: Power BI Direct Lake against the Delta table; doesn’t that work? Or do I need to hit refresh on the report page? Okay, the real question is whether the KQL database actually writes things down as Delta. This is what I don’t know; this is where I have to spend some more time on

39:15 this one. Even in the demo,

39:17 there’s a lakehouse component you can write the KQL output to. What I’m saying is: write it into the Delta table, gotcha, then connect the report directly to the Delta table and read it. Or do we have that odd issue where I need to refresh the browser for Power BI to go get the most recent data, as opposed to push, which Power BI has had

39:49 since forever, because it was one of the coolest demo parts: pushing data through the report so the visualizations change while they’re on the dashboard. Those are two different things: a poll versus a push. Yes. I will say this, I have done the demo, and there are a couple of flavors. I didn’t do the KQL piece; I did the demo around streaming stock market data, with what I believe they just call streaming analytics now.
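
For reference, the “push” experience Power BI has had since forever is the push/streaming dataset REST endpoint: you POST rows and pinned visuals update without a refresh or a poll. A hedged sketch; the dataset ID, table name, and token acquisition are placeholders:

```python
import requests

# Placeholders: a real call needs an AAD bearer token and a push-enabled
# ("streaming") dataset created beforehand.
dataset_id = "<dataset-guid>"
token = "<aad-access-token>"

url = (
    "https://api.powerbi.com/v1.0/myorg/datasets/"
    f"{dataset_id}/tables/RealTimeData/rows"
)
rows = {"rows": [{"ticker": "MSFT", "price": 411.22, "ts": "2024-01-01T00:00:00Z"}]}

resp = requests.post(url, json=rows, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()  # success means tiles update via push, no polling
```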

40:21 Whoops. So I’m going to throw caution to the wind here, well, maybe not the wind; I’m going to give you a cautionary tale of me playing with some of this. This is why you should play with it on a Fabric SKU that is a trial: you may not know what it’s doing, and you really need to understand the implications. To give you some context, I went in and turned on the streaming real-time analytics; it was very easy to do. I said, grab these stock prices, and every second it was

40:51 grabbing all the current stock prices and writing them down. As in all great IT projects, you say “this is cool” and let it run for a couple of hours, you go away and do something, you come back, oh yeah, this is really interesting, and then you go home for the weekend and forget you left it on. This happens all the time. After about two weeks of it collecting data, I came back and said: I think that streaming thing is still running; I wonder if it’s still going. I go look at it: yep, every second for the

41:22 last two weeks it had been grabbing lots of data. When I came back it was still collecting; I could see the stock prices, it was very simple, and it had even made a Delta table for me, which was easy to tap into. I could build a Power BI report on top of it, it was Direct Laking, no problem; everything ran really smoothly, and the experience was great from a designer standpoint. Behind the scenes, what I didn’t know is that every time it wrote those tables, it was creating a whole

41:52 bunch more Parquet files in the Delta table. I had 53,000 tiny little Parquet files sitting in my lakehouse. None of it was being optimized; it wasn’t being compacted or getting more efficient. I basically stopped the job once I figured out it was running, and then I wanted to delete the data: I had over 30 gigabytes stored there, just from turning a couple of buttons on.
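
The back-of-the-envelope numbers from that story illustrate the small-file problem (rough arithmetic on the figures given, nothing exact):

```python
# One write per second, left running for two weeks:
writes = 60 * 60 * 24 * 14          # 1,209,600 appends
files = 53_000                       # small Parquet files observed
size_gb = 30                         # total data collected

print(writes)                        # ~1.2M appends
print(writes / files)                # ~23 appends per file on average
print(size_gb * 1024 / files)        # ~0.6 MB per file -- far below the
                                     # commonly cited 100 MB+ sweet spot
                                     # for Parquet files in a Delta table
```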

42:22 So if you don’t know what you’re doing, you’re going to eat up a lot of storage space. Which isn’t terrible on its own: you can store a gig for a year for about 21 cents, so it’s darn cheap to keep data around for a long time. However, doing this you could easily eat up terabytes just by storing things that aren’t even used in the final analysis. And that’s a really good point: just because you’re interacting with these objects as tables, it’s not

42:54 necessarily cleaning itself up. These aren’t tables in the old sense; you’re not inserting a new row into a record, get that out of your head. It’s typically individual files doing this little dinky stuff, and there are absolutely things you need to do, optimize and vacuum, to clean up those areas. Otherwise that Delta table is going to slow down exponentially, and, to your other point, you’ll have massive volumes of tiny

43:25 files. So I did a couple of things. If you listened to my complaining on the podcast, I don’t know, two months ago, I was complaining that there’s this thing called a vacuum step and nothing runs it, and nothing optimizes. Believe it or not, inside the Fabric lakehouses there is now a right-click on all the Delta tables where you can say “optimize” and it runs; there’s now a button. That was not me; I just complained about it and it appeared, so take

43:56 it as you will. It was hard for me to understand what was being written, and a lot of the tools, like Dataflows Gen2, will write Delta tables, but based on what I understand today there is no optimizing or cleanup in that process. So there are other processes you need to put in place to clean and groom out the old files you don’t need inside these Delta tables.
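
Those maintenance steps can also be run from a notebook rather than the right-click menu. A minimal sketch, assuming a Fabric/Spark session and a Delta table named raw_events (hypothetical name):

```python
# Compact the thousands of tiny Parquet files into fewer, larger ones.
spark.sql("OPTIMIZE raw_events")

# Then remove the old, unreferenced files. The retention window (here the
# default 7 days / 168 hours) protects concurrent readers and time travel.
spark.sql("VACUUM raw_events RETAIN 168 HOURS")
```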

44:26 Again, this is one area I’m very passionate about: understand how the technology works, because it has implications for your design, and this is very different from what we’ve been doing in SQL. We have to think about things anew. You need to understand what it’s doing, because the Delta table is an immutable object, meaning you can never change it; you can only add new files that describe the final state of the table data. Understanding that changes how you design the system. Sorry, I didn’t mean to use big

44:56 words. Michael, you’re being so... I don’t know where that came from, it just popped out. Just incomprehensible. I had to Google it the first time I heard that word: what the heck does immutable mean? You’re immutable! I think there’s a more business-friendly interface, though. I don’t know how much you’ve actually played with Reflex and Data Activator yet, and that’s on the same stream here, because I think we’re all of the same opinion, at least on real-time analytics and KQL:

45:28 all right, it’s neat, it’s cool, but it’s still a very specific use case that I don’t think any of us are seeing, and there’s a very large learning or skill curve if you want that specific solution, unless you already have an event stream. Data Activator is a little different, and I think there may be a few more use cases there. It’s almost like the upgraded version of the real-time streaming dataset in Power BI, where I can actually create

46:00 events and push them from a Power BI visual, or from Power Automate, or from any other custom application. They make that a lot more seamless, and the time from creation to actually having data come in is a lot shorter. I don’t know if that’s a broader use case, where we can have some event streaming, but to me that’s where I see more of our use cases; it’s more relatable to us.

46:30 I like your point there, Tommy, because I think another very solid use case, and this happens, though I’m not sure I’m recommending it, is: I’m working on Excel files in SharePoint, I hit save, and that file changes. I want to grab a table of data out of that file and immediately land it somewhere like a lake, a state-in-time of that information. And to your point, someone can edit the file at random: I’m going to

47:00 edit it two or three times in one day, then leave it alone and walk away, and come back later to add more. So I don’t need it always updating, but it would be really nice to say: there’s a table inside this Excel document, and every time the document changes, grab a copy, immediately load it to my table, and then process my downstream tables so my dataset updates. These are the simple things. Think of it in your semantic model:

47:30 there are missing members in a dimension table, business users are updating something in Excel, and they only need to update that one dimension table based on what’s inside that Excel file. What’s the fastest, easiest way to do that? You know what you’re making me think of? Hold on, you’re blowing my mind. If you use Direct Lake, Direct Lake doesn’t refresh the entire dataset; the semantic model just looks at the tables beneath it. So if you were doing something like that, you could Data

48:01 Activator your Excel sheet into the Delta table that describes the dimension, and update the dimension table independently of your fact table. Because a Direct Lake semantic model is just looking at what the data already is, basically like doing DirectQuery to the lake, you could update individual tables without updating the entire model. I never thought of that; that’s a really good use case.
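
A hedged sketch of that dimension-only update in a Fabric/Spark notebook, using a Delta merge so new Excel rows upsert into the dimension table the Direct Lake model reads. The table and column names are hypothetical:

```python
from delta.tables import DeltaTable

# Latest copy of the Excel table, landed as a staging table upstream.
updates = spark.read.table("staging_dim_product")

dim = DeltaTable.forName(spark, "dim_product")
(
    dim.alias("d")
    .merge(updates.alias("s"), "d.product_id = s.product_id")
    .whenMatchedUpdateAll()      # refresh changed attributes
    .whenNotMatchedInsertAll()   # add brand-new dimension members
    .execute()
)
# The Direct Lake semantic model reads this Delta table in place, so the
# dimension refreshes without touching the fact table.
```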

48:31 It wasn’t initially where I was going, but I like the road you took: one of the use cases is being able to push to Direct Lake. But have you tried Data Activator on a visual? I have not; I don’t really know how that works. That’s one thing I haven’t had time to demo. So what’s your opinion

49:02 on that one, Tommy? It sounds like you’ve done a little demo around it.

49:08 This is actually something I want to explore even more, just for myself. What you can do is act any time a visual changes on any report; this is what used to be alerts. Again, it’s the upgrade of the streaming dataset and alerts in Power BI: if a visual changes, send an event to Reflex, or Data Activator, and I track all those events. And with Data Activator, rather than literally taking everything from a visual, the way real-time analytics just ingests everything coming from

49:38 the source, I can create specific events in a Reflex. I can say: when this card or bar chart changes, send an event with these properties. Also, if someone updates an Excel file, or updates this application, push an event from Power Automate with these properties, and I can track when all these things change. Now, right now that’s not what you’d call a scalable process, because you have to use Power Automate, though I will say I love Power

50:09 Automate, and I know entire organizations that rely on it, where you can add this as one of the actions to track those changes. This, to me, has much broader implications, and probably more use cases, for organizations that may not rely on real-time analytics but do want to see and react to these changes on a closer-to-real-time scale.
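
The core of that “react to meaningful change” pattern is deciding when a condition becomes true, not streaming every value. A tiny, purely illustrative sketch of the rule semantics (this is not Data Activator’s actual API, which is configured in the UI):

```python
def crossed_threshold(prev: float, curr: float, limit: float) -> bool:
    """Fire only on the transition above the limit, not on every sample."""
    return prev <= limit < curr

samples = [97.0, 98.5, 101.2, 103.0, 99.0, 100.5]
LIMIT = 100.0

for prev, curr in zip(samples, samples[1:]):
    if crossed_threshold(prev, curr, LIMIT):
        print(f"alert: KPI crossed {LIMIT} (went {prev} -> {curr})")
# Fires twice (98.5 -> 101.2 and 99.0 -> 100.5), and stays quiet while
# the value merely remains above the limit.
```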

50:40 missing in in an article like this for for future recommendation like the use case what’s the use case all all we have is like hey you can you can track something going across the sky and we even reference data activator and a reflex to say like hey you can set alerts why would I want to set alert you could like make it meaningful you can set an alert with your location in time so the data activator is going to shoot something off to you that says in 10 minutes the Isis is going to be overhead so you can see it screaming across the sky oh my gosh that would be awesome

51:10 sky oh my gosh that would be awesome yeah I also think you bring up a good use case because data activator you’re talking about the satellite in the sky not the bad people okays I know this article is about I know fly I was just confirming you were talking about a space station sorry with some problem the other hon the other part about this that that I think could have been taken next step is like hey for real time dashboarding and Reporting

51:42 real time dashboarding and Reporting yeah do you really want that just pumping live data across the screen or in reality are you looking for things because if you’re looking for things I don’t need to give you that I can plug in data activator and I could say I I see this anomaly and as I see this alom what you need to go is go refresh a report or go log in and get the information that is starting to be collected around the events that are happening right now that you’ve identified as a trigger that’s what’s helpful right the triage of a problem

52:12 that we’ve identified or that we’re setting up things around and you do that with I think even in here they’re referencing Logic Apps and things like that there are systems that allow us to fire off alerts Azure is full of them right Data Activator is the business case of that hey I have a bunch of these things going on I want to start tracking them shoot me some sort of notification based on what I’m telling you to let me know whether or not I have a problem
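
As a rough illustration of the “detect the anomaly, then alert” pattern Mike describes, the sketch below polls a Fabric KQL database for anomalous points using KQL’s series_decompose_anomalies function. The cluster URI, database, and table/column names are hypothetical; azure-kusto-data is the real client library.

```python
# Poll a KQL database for anomalies and notify only on trigger conditions,
# instead of streaming everything to a screen. Names are hypothetical.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

CLUSTER = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # hypothetical
DATABASE = "RealTimeEvents"                                       # hypothetical

ANOMALY_QUERY = """
DeviceEvents
| where Timestamp > ago(1h)
| make-series reading = avg(Reading) default=0 on Timestamp step 1m
| extend anomalies = series_decompose_anomalies(reading, 2.5)
| mv-expand Timestamp to typeof(datetime), anomalies to typeof(int)
| where anomalies != 0
"""

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(CLUSTER)
client = KustoClient(kcsb)
rows = client.execute(DATABASE, ANOMALY_QUERY).primary_results[0]

for row in rows:
    # In practice a Reflex/Data Activator trigger, Logic App, or Power
    # Automate flow would fire the notification instead of a print.
    print(f"Anomaly at {row['Timestamp']}")
```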

52:42 well that’s a lot more valuable than a screen of streaming data because what if somebody doesn’t see it right and I think again this is the scalable or the upgraded version because there was a way to do this before however it was a lot of steps for one alert I had to create a KPI card put it in a dashboard set the alert yes and then let it go but I can create one Reflex or one Data Activator and I can create events from all my reports all in one product in a sense

53:12 so all these events coming in could come from any of my reports it’s not like I have to create another reflex and another product for each you don’t need to create another reflex well the Reflex is the product and you have events in it so I just created an event per action so to speak so yeah you’re creating something but it’s not like I have 18 artifacts in my freaking workspace what I’m driving at is I don’t have to create another workspace

53:42 there’s four workspaces you should have another for that I was wondering if I needed multiple reflexes within a data activator oh each reflex with my multiple workspaces yeah reflexes okay yeah totally clear things react to the reflex that’s the workspace kind of thing every action should have its own workspace you should have at least a thousand because some actions shouldn’t be visible to other people exactly right 100%

54:13 you don’t want that it would be inappropriate to have all actions shown to everyone you have three for the KQL one for the K this is the type of conversation that would happen if somebody said hey where is that thing happening it’s like oh no it’s not in this workspace it’s happening everywhere no are you sure this is not the activity you’re looking for what’s the solution to this millions of dollars in Purview

54:43 yep not wrong not wrong how do I find anything we don’t know that’s amazing I like it I like it a lot all right so I think we’re at a good time to do final thoughts on our topic today final thoughts around real-time events and how Fabric’s changing the game around real-time events Tommy let’s go with your final thoughts if you’re already using real time and streaming

55:14 or you see your organization has a need for it if you’re already using eventstream or you’re already using Event Hubs there’s a possible really great integration here if there are more business cases where it’s one-off look at Data Activator look at Reflex you don’t have to go through the whole we-have-to-build-our-own-custom-application route I think there’s definitely more than one use case with Reflex where you can prove or provide impact for your team using activator Seth any final thoughts how do you see this

55:45 love it I love live streaming events no I think what people should know is it’s very accessible especially in cloud ecosystems like Azure it’s fantastic there’s such a wealth of information related to eventing that you may be very interested in and it’s very accessible so that’s fantastic getting it into a location for you to query and/or shoot into alerting systems is even easier now with Data Activator

56:16 so I love the fact that it’s even much more accessible and the solutions are easier to implement make sure it’s a worthwhile use case though right to your points Mike you could potentially create a lot of storage which initially isn’t going to be a big deal but over time you’d want to recognize that and make sure you have cleanup activities or things like that behind the scenes to really manage how much you’re actually storing on the front end
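
One concrete way to build the cleanup Seth mentions is a retention policy on the KQL table, so streamed events age out automatically instead of quietly accumulating storage cost. This is a sketch with hypothetical cluster, database, and table names; the .alter-merge retention command and the azure-kusto-data execute_mgmt call are real.

```python
# Set a retention policy so raw streamed events expire automatically.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

CLUSTER = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # hypothetical
DATABASE = "RealTimeEvents"                                       # hypothetical

# Keep raw events for 30 days; aggregate anything needed longer-term
# into a summary table (or land it in Delta) before it expires.
RETENTION_CMD = """
.alter-merge table DeviceEvents policy retention softdelete = 30d
"""

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(CLUSTER)
client = KustoClient(kcsb)
client.execute_mgmt(DATABASE, RETENTION_CMD)
```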

56:46 yeah you can use Power BI there are multiple different solutions for visualization of things make sure it’s applicable to the business case that you need I don’t see a ton of demand out there but the fact that we have the capabilities to see this data as fast as we can and there are different choices I think is pretty cool I think my final observation here is what I’ve alluded to in the past around this faster data more money you’re going to need it more data volume more money you have to be smart about how you are going to consume the information

57:18 and store it in a way that makes sense and I also think the other point here around splitting your real-time analytics away from your long-running historical analytics is a smart decision try to hone in on one use case instead of trying to build both use cases all together in one solution
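
The hot/cold split Mike lands on might look something like this in a Fabric Spark notebook: keep raw events in the KQL database for fast recent queries, and periodically land aggregates in a Delta table for long-running history. Everything here is a hypothetical sketch; it assumes the notebook-provided spark session and made-up table and column names.

```python
# Periodically aggregate hot-path events into a durable Delta table so the
# momentary "blip" becomes history you can trust. Names are hypothetical;
# `spark` is assumed to be provided by the Fabric notebook runtime.
from pyspark.sql import functions as F

# Raw events already landed in the lakehouse (hypothetical table).
raw = spark.read.table("lakehouse_raw.device_events")

hourly = (
    raw.groupBy(F.window("event_time", "1 hour"), "device_id")
       .agg(F.avg("reading").alias("avg_reading"),
            F.count("*").alias("event_count"))
)

# Append to the long-running historical store.
(hourly.write.format("delta")
       .mode("append")
       .saveAsTable("lakehouse_history.device_events_hourly"))
```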

57:48 all right with that I say thank you very much for listening to the podcast if you found this enjoyable or even got some insights out of it we’d love for you to share it with somebody else we don’t spend a dime on marketing this thing so it’s all on you guys to push it out there and let other people know you found value from this podcast we appreciate your time we hope you had a good run on your treadmill or your walk or whatever you’re doing right now for those of you who listen purely online and don’t watch the YouTube video we still like you as well we appreciate all your listenership thank you so much please let somebody else know you found value in this episode and share it with somebody else Tommy where else can you find the podcast you can find us on Apple Spotify or wherever you get your podcasts make sure to subscribe and leave a rating it helps us out a ton

58:18 you have a question idea or topic that you want us to talk about in a future episode head over to powerbi.tips leave your name and a great question finally join us live every Tuesday and Thursday 7:30 a.m. Central and join the conversation on all of PowerBI.tips social media channels thanks guys so much appreciate it I’ll see you next

58:55 time [Music]


Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
