PowerBI.tips

Semantic Link Labs Updates & Scenarios – Ep. 397

February 12, 2025 | By Mike Carlo and Tommy Puglia

Mike and Tommy break down what’s new in Semantic Link Labs and why it’s becoming a go-to toolkit for automating Fabric and semantic model workflows with notebooks. They share practical scenarios—from incremental refresh policy updates to operational monitoring—so you can move faster while keeping governance in mind.

News & Announcements

  • Introducing template dashboards for Workspace Monitoring — Microsoft shared community-built, open-source template reports (Power BI and real-time dashboards) designed to plug into Fabric Workspace Monitoring. The goal is faster troubleshooting by starting from pre-built patterns for things like long-running queries, refresh operations, and ingestion delays instead of building every monitoring report from scratch.

  • Private Preview of Migration assistant for Fabric Data Warehouse — The Fabric team opened a private preview of a migration assistant intended to accelerate moving to Fabric Data Warehouse from sources like SQL Server and Synapse dedicated SQL pools. It focuses on converting schema/code, supporting data migration, and adding AI-assisted guidance—worth tracking if you have legacy warehouse workloads you’re planning to modernize.

  • Introducing ownership takeover for Fabric items — Fabric now supports “take over” (ownership takeover) for many non-Power BI Fabric items when the original owner leaves, loses access, or credentials expire. This is a key business-continuity feature for keeping pipelines, lakehouses, and endpoints functioning without needing to rebuild assets from scratch.

  • Submit a topic idea / mailbag — Have a question you want covered on the show? Drop it in the mailbag form—episodes are best when they start with real-world scenarios.

  • Subscribe to the podcast — One hub page to catch the live stream and find Spotify/Apple links to listen later.

  • Tips+ Theme Generator — Generate consistent Power BI themes quickly so your team can stop hand-tweaking colors and fonts across reports.

Semantic Link Labs is quickly becoming a “power user’s automation layer” for Fabric and Power BI semantic models—especially if you’re already living in notebooks.

In this episode, Mike and Tommy talk through the practical reality of where Fabric is heading:

  • More operations are moving into the service and APIs. If you can script it, you can standardize it.
  • Notebooks are the control plane. Not just for data engineering, but for managing semantic models, governance workflows, and operational hygiene.

At a high level, Semantic Link Labs is a community-driven toolkit (published by Microsoft) that extends Semantic Link patterns—helping you automate tasks that historically required a mix of manual steps, Desktop work, or niche tools.

A few themes that come up repeatedly:

  • Repeatability: turn “tribal knowledge” model operations into code.
  • Scale: apply the same patterns across many workspaces/models.
  • Governance: make “safe defaults” the easiest defaults.
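These themes share one shape: wrap a model operation in a function, then apply it across every workspace/model pair instead of clicking through each one. A minimal sketch of that pattern is below — the `operation` callable is a stand-in for whatever Semantic Link Labs call you'd actually run; the names here are illustrative, not part of any library API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class RunResult:
    workspace: str
    model: str
    ok: bool
    detail: str

def run_everywhere(targets: list[tuple[str, str]],
                   operation: Callable[[str, str], str]) -> list[RunResult]:
    """Apply one model operation across many workspace/model pairs,
    capturing failures per target instead of aborting the whole run."""
    results = []
    for workspace, model in targets:
        try:
            detail = operation(workspace, model)
            results.append(RunResult(workspace, model, True, detail))
        except Exception as exc:  # keep going; report everything at the end
            results.append(RunResult(workspace, model, False, str(exc)))
    return results
```

Collecting per-target results (rather than failing fast) matters at scale: one broken model shouldn't stop the refresh-policy update on the other forty.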

Scenario 1: Automating incremental refresh policy changes

One of the standout real-world scenarios discussed is updating incremental refresh policies as time moves forward—especially when your business logic doesn’t match the out-of-the-box policy behavior.

A concrete example is covered in the episode.
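To make the scenario concrete, the date math behind a rolling-window policy can be sketched as a small helper: keep N full calendar years archived, and re-refresh only the trailing M days. This is illustrative policy logic only — the function name and signature are made up for this sketch; an actual policy change would go through Semantic Link Labs' incremental-refresh helpers, not this code.

```python
from datetime import date

def rolling_window_bounds(today: date, window_years: int, refresh_days: int):
    """Compute the two boundaries of a rolling-window refresh policy:
    - archive_start: earliest data kept (start of the calendar year
      `window_years` back)
    - refresh_start: earliest data re-processed on each refresh
      (`refresh_days` before today)"""
    archive_start = date(today.year - window_years, 1, 1)
    refresh_start = date.fromordinal(today.toordinal() - refresh_days)
    return archive_start, refresh_start
```

The point of scripting this is the episode's theme: when your business rule ("always keep five full years, but only reprocess the last ten days") doesn't match the out-of-the-box policy behavior, a scheduled notebook can recompute and push the boundaries as time moves forward.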

Scenario 2: Keeping up with the rapid release cadence

Semantic Link Labs is moving fast, and tracking releases is part of using it effectively:

  • Releases · microsoft/semantic-link-labs — New and updated functions are frequently added across areas like semantic model operations, Direct Lake utilities, and broader Fabric admin/ops scenarios. If you’re building notebook-based automation, skimming the release notes regularly can surface new capabilities you can immediately fold into your standard workflows.
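One practical consequence of a fast release cadence: pin the library version in your notebooks (e.g. `%pip install semantic-link-labs==<version>` — `semantic-link-labs` is the PyPI package name) and upgrade deliberately after skimming the release notes. A tiny version-comparison helper shows why naive string comparison isn't enough; this is a generic sketch, not part of any library.

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Turn '0.8.10' into (0, 8, 10) for numeric comparison.
    Plain string comparison would wrongly rank '0.10.0' below '0.9.0'."""
    return tuple(int(part) for part in v.split("."))

def needs_upgrade(installed: str, required: str) -> bool:
    """True when the installed version is older than the required one."""
    return parse_version(installed) < parse_version(required)
```

In practice you might run this at the top of an automation notebook and fail loudly if the pinned minimum isn't met, rather than hitting a missing-function error halfway through a tenant-wide run.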

Practical takeaway: treat notebook automation like production code

Mike and Tommy emphasize a pattern that more teams are going to need as Fabric expands:

  • Put notebook-driven admin/model changes behind process (reviews, owners, environments).
  • Prefer repeatable scripts over “clickops.”
  • Build guardrails early so “self-serve” doesn’t turn into “wild west.”
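A guardrail can be as simple as an allow-list check at the top of every automation notebook. The workspace names and environment labels below are hypothetical — the point is the shape: refuse to run before anything destructive happens, not after.

```python
# Hypothetical allow-list; in practice this might live in a config file
# or a control table rather than being hard-coded in the notebook.
ALLOWED_WORKSPACES = {"BI-Dev", "BI-Test"}

def guard(workspace: str, environment: str) -> None:
    """Refuse to run automation outside sanctioned workspaces, and block
    direct prod changes so they go through the release process instead."""
    if workspace not in ALLOWED_WORKSPACES:
        raise PermissionError(f"{workspace} is not on the automation allow-list")
    if environment == "prod":
        raise PermissionError("prod changes must go through the release pipeline")
```

Raising early keeps "self-serve" scripts from quietly touching workspaces they were never reviewed for.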

Looking Forward

Semantic Link Labs is a signal of where the Fabric ecosystem is going: more programmatic management, more automation, and more “data + BI + ops” converging in one platform. If you invest a little time now in notebook-based workflows (and governance around them), you’ll be in a much better place as your tenant, capacity, and semantic model estate grow.

Episode Transcript

0:32 Good morning and welcome back to the Explicit Measures podcast with Tommy and Mike. Good morning everyone, welcome back. Good morning Mike, how was your weekend? It went very quickly. I've been very busy with software and building things and lots of projects going on, but I took a little bit of time to relax. I have, for whatever reason, a nice good old cold or flu or something on me. Oh no. I am just feeling congested, so I did a lot of laying low.

1:02 We are doing a larger kitchen remodel in our kitchen, doing some refacing. We're leaving the cabinets alone but we're doing new countertops, some big things that my wife has won, and it's been very exciting to see that come along. But needless to say we've been eating out a lot because we don't have a kitchen, there's no sink, it's out of commission right now for a little bit. Dude, is it a full revamp, or are we just doing like cabinets, a little... I guess you would call it refacing the cabinets.

1:33 The cabinets are getting refaced, so they're getting all new doors and handles, we are adding some pull-out drawers inside the cabinets, and then new countertops. So we're not going down to studs, we're not going down to drywall, we're just using what's already there. The structure is not changing, we're not adding new cabinets, we're not moving things around, it's all the same stuff. Okay, needless to say you can't use it. Well, it's been nice because while we've been having work done on the cabinets we've been able to use the

2:05 kitchen, but now we're at the very end, and at the very end they have to take out the plumbing, they put new countertops in, and that's when you lose the sink. So hopefully today we get the use of our sink back and we can start reusing our kitchen. Now we've got the hard decision: what backsplash do you want? Oh, you should call my wife, she was in a previous life an interior designer. Oh excellent. And she married me, the non-interior designer.

2:36 No, no, colors are not a thing. I think for a lot of people, depending on the colors that you learned, those are the colors that you actually see. So my wife tested me out on this, and she's like, what color is it? I'm like, well, you didn't learn mauve. I know the rainbow and I know the variations: teal, yes, slightly more teal, but very teal. I think my wife has 50 colors she knows. Wow.

3:07 My wife has 50 colors she knows, wow. And so like I was looking at something, oh it's blue, and she's like no, it's acrylic, da-da-da. Like I couldn't see it, but I also never learned what those colors are, so it's amazing, at least for an interior designer. Interesting. I'm not sure if I would be able to pick out or describe multiple colors for people. I could probably tell you what some general colors are, to your point Tommy, but I couldn't do

3:39 all these nuanced colors, like green, you know, seafoam green, all these others. I wouldn't know these things, I'm very simple in that regard. It's interesting, I'm not good at making designs, but I'm really good at looking at other things and being able to say, oh I like that, oh this is a good design. So I'll tie this back here to Power BI a little bit. When I look at colors or color palettes, I'm probably not the best at knowing how to create

4:11 something from scratch, but I can do a really dang good job of going into Figma and finding a different report that I like and copying it, or bringing down the colors, or building a style that looks similar to that object. And that's a lot of what I do when I build reports. I'll go find an image, a picture, or someone's mockup of a dashboard, and I'll say, oh that looks interesting, and then I'll go riff on that design, make it my own, and then build a lot of extra features and pages and backgrounds for it. So a lot of times I'll do that,

4:42 I'll take inspiration from other people's works. I think it's like the phrase, great artists... what is it? Great artists borrow, or something like that, or steal, I don't remember that phrase. Is it good artists borrow, great artists steal, or copy? Yeah, I think Steve Jobs said it. So can you articulate... oh, "great artists steal"

5:12 is I think the phrase, well at least that's what ChatGPT says. Can you articulate, when you see something, what you like about it? Or is it more "I just like that", even when it comes to design or your kitchen? Oh, good question. I'd say I can probably pick up on pieces of it, I can articulate what pieces of the design I really like, and that's usually what I spend my time on mimicking. I'll give you a really weird bit of history, I guess.

5:42 When we first got married we didn't have much money, and one year we wanted to do Christmas cards. My wife would go online and she'd see all these Christmas cards, and she'd be like, oh that's an interesting Christmas card, and she would show me the picture of the Christmas card. And I took it as a challenge to go into a computer program and mimic that card: look at the details, figure out how to replicate it. And I would replicate these Christmas cards by making my own design, entirely in Illustrator, and then we'd go to the

6:12 store and we'd print off a bunch of pictures or images off of like Walmart or whatever, and then we would set out a bunch of photos basically, but it was of an image that I had designed. Now granted, in the early days I had nothing but time, it was just time and computer building. There were not as many kids, so I could spend four hours just trying to figure out how to get the design of the thing to work. I'm Googling things, I'm learning about Illustrator and Adobe products, and maybe I'm using some

6:42 Photoshop in there a little bit too. So I'm doing all this work with this program just to make these good-looking images. I think that helped a lot, because now I can look at other images and know how they're built. It's a skill, it's something you have to learn how to do, and this is I think one of the things that is difficult for people in Power BI: we're really good at data modeling and bringing together the data structure, but I think the majority of the people building reports don't have the eye or the visual or the aesthetic side of

7:12 things, and that's where people need a little bit of help. You could easily spend as much time doing the data modeling, double that time, and you could do all the report visual styling, right? So where do you cut corners? We've had this discussion a long time ago, of what's the right balance of that inside your reporting. Anyways. No, I'm honestly the same, the design side took a long time for me to actually, so to speak, understand what I

7:42 was trying to do, or actually put a design together that made sense to someone, because I was analytical. I didn't know what I was supposed to build, I just had things that I thought looked good. But I think books helped, and the biggest thing that actually helped me was asking someone else who had no stake in the data, like a friend or spouse: hey, what is this telling you? Yeah, so I like it. Awesome.

8:13 Before we jump into our news topic today — we just started a little rambling here at the beginning, so we'll stop rambling for a moment. Today we're going to talk about Semantic Link Labs: what updates have recently been coming out. This tool is amazing, by the way, it's an incredible tool, it does so many different things. And then, what are different scenarios or patterns for when you use Semantic Link Labs to work inside your tenant. Now, to be very clear, Semantic Link Labs is a Python library based on another library called

8:43 SemPy, which I had to really think about. It's SemPy because of semantic models, right? Semantic models is the first part of it, and then "py" as in Python, so it's like Spark and PySpark, right, or the other languages, Spark SQL. Those are the things relating to that library. The library, semantic link, yeah,

9:13 Semantic Link is built directly by Microsoft, maintained by them, and allows you direct access to the semantic models in your tenant, but does a lot more than that. Semantic Link Labs extends this heavily, and so we're going to unpack a little bit more of Semantic Link Labs: where to use it, what we're finding with it right now, and then jumping in. That'll be our topic for today. All right Tommy, give us some news. So there are a few things that actually went under the radar, and just because we didn't talk

9:43 about it didn't mean no one read it. But I think with a few articles, if we didn't say it on the podcast, it's like: did the feature actually get released? That's what we're talking about. I'm actually surprised we haven't talked about this. So the first one I want to bring up is the template dashboards for workspace monitoring, which I thought, huh, I don't know why the immediate interest wasn't there,

10:14 but these real-time dashboards work with the workspace monitoring that you're used to, and they're actually community built, they're open sourced, and they can be downloaded

10:26 through GitHub, and really what they allow you to do is go through your environment. Basically, the templates that we're used to when it comes to the monitoring that Microsoft does, which we'd have to live connect to if we wanted to modify, these are all community built, and you can edit the Desktop version and customize it, which is really neat. So this is a tool

10:56 that Microsoft has built to enable users to get better analytics about their workspaces, and it's using a monitoring Eventhouse. So this is streaming analytics directly into a collection, an area or store basically, and then from that store they're giving you some reporting, off-the-shelf, out-of-the-box tooling, inside their fabric-toolbox on GitHub. So fabric-toolbox is the GitHub repo, and I'll make sure I put both of these links down below in the chat windows for everyone to see how

11:28 that works. Let me get those over to the chat window. If you want, go check those out, we'd highly recommend them, really good tools. Yeah, I'd be curious if anyone else is using these or seeing them. One thing I will caution you on — and this is my understanding, I haven't played with it, I've read the documentation but I haven't really gone into turning it on yet — my understanding is this is the same thing as turning on log analytics for your workspace. So this could get very noisy, there could be a large volume of data coming over to your

12:00 Fabric workspace. Anytime you send more things like this or turn on any real-time eventing, you're immediately going to consume more capacity. So my big caution with this is: you can turn this on, it's a good solution, it is streaming, however just be cautious, because when you turn it on it will send a lot of data. Potentially every interaction, every query, all the details from the workspace will be sent directly to the stream. Good, but

12:30 I would probably recommend: don't turn it on and forget it. I would definitely recommend turning it on and just watching it, and making sure you turn it off, because it could potentially add a lot of cost, or a lot of additional compute units to your SKU. So just be aware of that as you use the tool. That's probably a really good thing to point out, because people might not know how much data is actually going in. And yeah, I thought it would be through the normal monitoring

13:02 that you get with the admin, but no, interesting, it's a different solution. They're trying to be more transparent. I was talking with — oh shoot, I forgot his name — he works for Microsoft and he just did our Milwaukee User Group recently. The name will come to me, like literally halfway through the podcast. Anyways, we had the PM speaking on all the admin monitoring and going through the details there. Tommy, he spoke at your user group too. Oh,

13:33 Gil Raviv? No, after Gil, that was the same session, it was after Gil was doing his... oh, our main speaker, forgot his name. Milwaukee... I remember he used to work at Milwaukee Tool, I think. I don't even know, anyways, we'll figure it out later, Tommy will go look up his user group session and he'll get the name. Anyways — Tim Bendis, there it is, someone in the chat. Tim Bendis is a Microsoft employee and went through all the details of the monitoring app. There's a lot of things

14:04 coming, and Microsoft is trying to make the entire monitoring experience easier to understand. Thanks Dan, while I stumbled around there for a bit, appreciate it. All right, let's move on to the next one. What's our next news item, Tommy? Another announcement, I guess, there's really no feature coming out, but it came out on January 29th: the private preview — private preview meaning everyone can't get to it quite yet — of the migration assistant for Fabric Data

14:36 Warehouse. So basically Microsoft let people know last year that there would be a migration assistant, and they're currently running a private preview of it. They're looking for participants: you can complete a form and actually get access to the migration assistant. And again, that migration assistant helps accelerate moving your SQL Server, Synapse, and other warehouses to Fabric Data

15:08 Warehouse. And I think the idea here is — the reason they go to private preview is they think they've got the tool or the migration solution complete, but they're testing it out with the first couple of customers. So while we can't necessarily always access these private previews, just be mindful that usually behind a private preview a release is coming fairly shortly. It means we're at the very last stages of bug fixing and feature

15:38 refinement, and then there will be a tool, or a repo, or GitHub, or something that will tell you how to leverage this migration. So I guess the point here is: if you want to migrate SQL Server, SQL dedicated pools, or other warehouses to a Fabric Data Warehouse, you should soon be able to go through a migration process. Anyways, there is a form on this page, so if you would like to participate in this one you can apply to be part of the private preview.

16:08 It's a form on the page that I just sent out in the chat window, so check that out, you can go see the form there if you would like to get involved. I'll actually see if I can link to the form directly in case you want to try this out. Anything else, Tommy? One last thing that I'm surprised hasn't been one of our major topics: the January 28th Fabric update. This is big, I think this is the biggest one of the newest items here. Yeah, ownership takeover for Fabric

16:41 items. And what this feature really does is allow Fabric users with the right permissions to take ownership of an individual Fabric item. It's similar, if you remember, to the same experience in Dataflows Gen1: because you couldn't collaborate, the only way to edit was to actually take over. So you can take over, as a user, an individual Fabric item. Some of the limitations: it does not support

17:11 mirrored database items, but this is pretty cool, because we've talked about ownership and governance and security in a workspace. And I'm really excited to see this, because first, I hope it's not the same experience as the Dataflows Gen1 takeover, which was just "can I edit". But at the same time this is a big deal, because we want to have some ownership of more than just a folder, more than just a workspace, on

17:44 particular items. Now, I don't think this modifies your ability to edit if you already could edit it. Let's say a particular item — say I'm in a Lakehouse and I had permissions to write: nothing changes, I'm just the owner of that artifact. Well, this is the issue around someone leaving your organization when they own something like a Lakehouse. So this is solving a direct problem: previously, if I created a Lakehouse and moved on or left the

18:14 company or whatever, that item would be linked to me, the owner would be my name, and in order to switch that ownership from me to Tommy it required a help desk ticket. You needed a help desk ticket to physically change the item, the Lakehouse, from my name to Tommy's name. This is basic features, right? You can't let people create things and not be able to switch ownership of them, especially if all

18:44 the items in the workspace are attached to people. One of the things I'd like to see here — I don't know if they've talked about it in this one specifically, but I think they're not talking about service principals or workspace identities yet, right? So the one thing to consider, it's in the limitations of this feature: it says Fabric item takeover does not cover ownership takeover as a service principal. So if you have a service principal or a workspace identity, those do not work with this takeover

19:14 experience. And there are some other things here that don't support changes, right: mirrored Cosmos DB databases, mirrored SQL databases, mirrored Snowflake — these are other items that are not quite there yet. I would imagine eventually they're going to close the gap on these things. But yeah, they have to, this was a big gap in my mind. I can't tell you the number of people that had pain around trying to move lakehouses between users, and we were just stuck, and the best thing to do was a help desk ticket.

19:44 That's absurd, this is very basic functionality, so it's good that they're cleaning this up. This is definitely a pain point of my own, I've gone through a number of projects where this was quite painful, and I'm very happy to see this one being resolved. Yeah, and I think there are plans to have API support, so you can actually bulk edit and take over from a user, which will be really nice. Right now everything's just the UI still. I'll take it. Yeah, I think the API stuff is going to be really good, and APIs I feel like are

20:14 required for more of the continuous integration, continuous deployment — the CI/CD type things — where you're going to need to programmatically publish an item, let's say a Lakehouse, and you want to regularly publish who has access to that item. Because when you think about continuous integration and continuous deployment, a lot of people think it's deployment pipelines, which it is, it does some of that, but there are a lot of other things you need in addition: who's the owner of this item, who are the owners or members of the workspace, did people accidentally add

20:45 themselves to those workspaces, and when you go redeploy the items you want to

20:49 again update the permissions to the workspace so that's always consistent and you have exactly the right access to the items in the workspace. There are a lot more things that can go on when you deploy, which the API will be, I think, very useful for assisting with. Okay, that's it for news — anything else we should talk about? I think that's good from a news perspective. You got anything else? Nope, that's good for me, that's all the fun things I want to talk about for now. All right, let's jump into our main topic today. Tommy, give us an overview,

21:21 we're going to talk about Semantic Link Labs. Let's get a little bit of an overview of what that tool is. Yeah, so Semantic Link Labs is a Python package, available in Fabric notebooks, that really allows you to do a ton of administration, governance, querying, and analytics on all of your Fabric items. There's a huge part of this when it comes to administration, getting groups — so all of the API calls that you

21:51 normally do, for those old-school folks who had a PowerShell script from Rui Romano, that old package — all those calls are available in this Python package, so I don't have to worry about all those additional configurations. But it does a ton more. Really any artifact that I want to go through — like a semantic model, if I want to pull a table or a column and make a DataFrame, I can. If I want to get

22:23 deployment pipeline information, I can. It's really the full stack in terms of "help me manage my Fabric tenant". I feel like Semantic Link Labs requires a nice acronym — SLL, Semantic Link Labs. Anyways, that being said, I wanted you to check it out, so I'm putting a link in the description here. Semantic Link Labs is an extension of the SemPy library. The SemPy library

22:55 is semantic link, which allows the Python notebook to connect or attach to semantic models. Semantic Link Labs extends this, and it goes way beyond just the semantic model. It adds things like the Best Practice Analyzer, or automation around it. BPA is a really great tool — it's an idea that came from Tabular Editor, where you could make rules and test those rules against your semantic model,

23:26 right: this column name is incorrect, there's a relationship in this model that seems inappropriate, there's a many-to-many relationship — it can detect those things. It touches other things like the VertiPaq Analyzer, which has now been integrated into the tool. You can do other things: check things for Direct Lake, make sure your model is within the guardrails of what Direct Lake can and can't do, refresh, clear your cache, run DAX queries, and you can manage query scale-out on

23:57 your different models, visualize a refresh — what happens during a refresh cycle — and look at the measure dependency tree: basically you can pick a measure in the model and see all the other measures that depend on it inside a tree diagram. It's just amazing. It has a whole bunch of reporting things, and this is one of the reasons I'm so excited about Semantic Link Labs, because we're talking about semantic model things, but then when we transition over, Semantic Link Labs

24:28 can do things in the report. The report has a report Best Practice Analyzer, and that also runs in Semantic Link Labs. You can get report metadata, view broken reports, set a report theme, migrate things about reports, rebind reports — this is stuff that has no equivalent inside Tabular Editor, Tabular Editor has no capability for you to do these things out of the box. And Michael Kovalsky is the one who's building this project. It does a ton of other stuff around capacities, lakehouses,

25:00 notebooks — it talks to all of the APIs for Fabric: all the Power BI APIs, all the Fabric APIs, all the Azure APIs, and now the Microsoft Graph APIs, and it also now supports service principals for authentication. It's like your Swiss Army knife, it does everything. Yeah, and I believe we talked about this before with Kurt, because he had an article about going through your entire report just using SemPy,

25:31 actually updating the theme, updating visual properties, or updating broken visuals. Yeah, Michael has done demos of this, and then Kurt picked up on it and also wrote about it and did some talking on that as well. But when you think about Semantic Link Labs, you're probably thinking only semantic model — I want you to open your mind up and say this has way more features than just the semantic model, it's bigger than that. Yeah, exactly. One other thing: they have Direct Lake

26:02 migration for any of your models, which is absolutely insane. So a few other things, just to make sure we're covering all the things this Python package can actually do: it can view report metadata, as you said run the report analyzer, view the measure dependency tree, visualize a refresh, autogenerate descriptions for measures in bulk, optimize lakehouse tables, vacuum lakehouse tables,

26:34 and migrate Power BI Premium capacity. It's absolutely insane in regards to all the scenarios here. Like you said, it's much more than just your Power BI model, which we were used to being the only thing we could actually do something with — via the XMLA endpoint back in the day, which was awesome — but this really opens up all the Fabric artifacts and what you can do. So I was talking with Michael Kovalsky recently about this exact tool, and he was explaining a little bit,

27:06 a speaking engagement he did, I believe at the Fabric conference over in Stockholm; that's where he talked about this one. Michael Kovalsky is on the CAT team, the Customer Advisory Team, and he's the one who originated semantic link labs. On that team he deals directly with big customers, or really all customers, and initially he built a couple of Python libraries, picked it up, thought, oh my gosh, this is easy, and basically learned Python. He wasn't

27:38 really a Python writer to begin with; he knew programming, but it wasn't really his forte. He learned Python, and in about a year he had started building all these helpful tools and producing a lot of things that made work easier. Well, now his full-time job is maintaining, supporting, and building more features into semantic link labs, because by solving a problem with this tool he's able to solve not just one customer's problem, he's solving a multitude of

28:08 problem he’s solving like a multitude of customers problems all at the same time so one thing one thing I’m really excited about is when you go to the GitHub repo and look at all the releases the release list is incredible on this tool he is chunking out releases like multiple times per month so in 2014 he had four updates one per week in December in November he had three per week through December and he’s already had two in January and one in February I

28:38 They're making a ton of stuff in here that's adding a bunch of really neat features, so it's almost hard to keep up with the tool because it's moving so fast and doing so many things. Really, really like it. And we should give a real shout-out to our friend Gilbert Quevauvilliers from FourMoo; that's actually where the idea to dive into this a little more came from. One of the cool scenarios I saw was what Gilbert did: he basically looked to automate

29:10 updating an incremental refresh policy, and he was able to do that with semantic link labs, which is insane. Yeah, but this is the stuff we're talking about, though: it's heavy automation. This is for professional developers, for users who are very much in the weeds and the code of things. So, well, let me just

29:41 give you my perspective on this one. There are a couple of things here. The combination of semantic link labs, in concert with the TMDL view, in concert with DAX query view, those three, let's call them tools for lack of a better term, replace almost every single other external tool out there that talks to semantic models. I think these tools can do everything the

30:11 other tools can do and more, and the bonus here is, compared to Tabular Editor, I don't have to learn C# just to write a script or an automation. I can just write it in Python, which in my opinion is much easier to write. So here's a question, because you brought up something very intriguing. If you think about all of the current API or developer ways to modify or read your Fabric tenant, you

30:42 talk about the XMLA endpoint, the Power BI REST APIs, and then now there's really SemPy and semantic link labs. If you are the data czar of an organization, do you care how people are actually doing this? Let's take it first with the heavy developers, the people who actually do need to automate their Power BI items or Fabric items through some custom endpoint. Would you push everyone to only use semantic

31:13 link labs, or

31:15 SemPy, compared to the REST APIs, the XMLA endpoint, PowerShell? I think those are really the three ways you can, in a sense, connect to or talk to Power BI. Well, let's also talk about other external tools, right? There's DAX Studio, yeah, also Tabular Editor. So I'm going to look at all the tools and ask, okay, which ones provide the highest level of automation? Tabular Editor and semantic link labs provide high levels of automation. I

31:45 have really struggled, just me personally. I've tried to get Tabular Editor to run in a deployment pipeline, where the pipeline runs something and Tabular Editor recompiles the code or deploys something. It's difficult, and I couldn't get it to work correctly. The documentation is okay, but Tabular Editor wasn't really built as a CLI, a command line interface. I want a CLI to be able to say, look, this is where my model lives, I want to package my model and deploy it to this workspace. Well, it

32:16 works, but you have to be really in the weeds of things; it's just more difficult. I'd rather write a Python notebook and use that to make a sequence of cells that does what I want. So that's one difference there. I've already run into problems with other tools that are difficult to write code for, or where it's difficult to write command-line-style actions. Semantic link labs makes this much easier. Also, I'm writing

32:47 just pure Python, and there are way more functions in semantic link labs: refresh your semantic model, optimize things, get this measure, run this SQL query. There are just a lot of nice features around it; the functions are easy to understand and easy to run. It just makes a lot more sense to me. So now let's compare the pricing of all these tools. Tabular Editor, if you're going to go professional with it, is going to cost you a little bit of money. It is a premium-level tool. It is for the developer, when you need to

33:17 start manipulating partitions and going deep on the tables, and when models get really large you're not going to want to edit a model using Desktop; you're going to outgrow Desktop very quickly. Now that I have the TMDL view, DAX query view, and semantic link labs, these are all free tools. It doesn't cost me a dime. Now, I might be writing a bit more code, I might have to write a few more cells inside my Python notebook with semantic

33:48 link labs, but I'm willing to learn that in exchange for free. It may not be the most optimized tool out there, but I don't have to learn C#, it's a free tool, and I can do way more things, not only at the semantic model level but also at the report level. To me, I'm sold. And when I was watching Michael Kovalsky talk about this, he did a demo where he went to, I think it was a lakehouse, I can't remember exactly

34:18 what he was doing. He went to a lakehouse, read the metadata from the lakehouse, then created his own tables. Sorry, let me step back here. He was using Best Practice Analyzer, BPA. He went to a workspace and said, this is my workspace, run the BPA tool on every model in the workspace. He ran every model, got all the rule results back, and wrote the tables to a lakehouse. Once the tables were written to

34:49 the lakehouse, he made a semantic model programmatically, from scratch, using code, and then he made the report programmatically. He connected the semantic model to the tables, connected the report to the semantic model, and everything just worked. From a single notebook he had run Best Practice Analyzer, made the semantic model, and made the report, all within a couple of lines of code.

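The workflow described here, run BPA on every model in a workspace and land the findings as rows you can report on, is at heart a rules-over-models loop. Below is a minimal pure-Python sketch of that shape only; the two rules and the model metadata are invented for illustration, and a real notebook would call the BPA function in semantic-link-labs and write the resulting frame to a lakehouse table instead.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    columns: int
    has_auto_datetime: bool

# Two toy "BPA rules" standing in for the real rule set.
def check_auto_datetime(m):
    # Real BPA flags auto date/time tables as wasteful.
    return "Disable auto date/time" if m.has_auto_datetime else None

def check_wide_model(m):
    # Flag very wide models; the 200-column threshold is made up.
    return "Model has too many columns" if m.columns > 200 else None

RULES = [check_auto_datetime, check_wide_model]

def run_bpa(models):
    """Run every rule on every model; return flat rows ready to land in a table."""
    rows = []
    for m in models:
        for rule in RULES:
            violation = rule(m)
            if violation:
                rows.append({"model": m.name, "rule": rule.__name__, "detail": violation})
    return rows

workspace = [Model("Sales", 250, True), Model("Finance", 80, False)]
findings = run_bpa(workspace)
# findings holds two rows for "Sales" and none for "Finance"
```

The flat `model / rule / detail` rows are the point: once every model's findings share one schema, a nightly append into a lakehouse table gives you the reporting layer the demo ends with.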
35:20 An automation like that can't be done in any other tool. I looked at that and said, this single tool, semantic link labs, just killed every single external tool out there that deals with semantic models, and even some lightweight report editing as well. This is incredible; it does so many things. Yeah, my PowerShell scripts had their day in the sun back in the day, they could do some crazy things, but it's interesting with semantic link labs: the official documentation, because it

35:50 the official documentation because is obviously coming from cementing link which is the actual package and Microsoft actually developed then the the official documentation actually says that the cementing link is a feature to establish connection between models and data science yes right so and it’s funny that it was only really initially built for just the data science side because they’re thinking hey python notebooks Jupiter notebooks that’s data science that’s GNA really just be that data science spere cool awesome so

36:23 but that's just the semantic model piece; that's SemPy, that's all that is. The Labs part of this makes it really rich, because now it includes a whole bunch of other things; it has a ton of administration-level features in it. That's awesome, it's just absolutely incredible. So again, if you're a new user thinking, what should I be learning now, you guys are talking about the TMDL view, DAX query view, and now semantic link labs: dude, if you have Fabric in your tenant, 100% you should be learning

36:56 tenant 100% you should be learning semantic link labs immediately it will make your life way easier now it’s very code Centric it’s very detailed but if you want to be work smarter not harder it this is the tool for you so man this interesting because yeah I think we still know and I’m still on the the side that most data professionals or powerbi professionals probably hav’t by now maybe have you probably hav’t by now maybe have like open a Jupiter notebook or

37:27 done a notebook, but that's probably not been their forte. So say we're introducing Fabric to my organization, and up until this point the business intelligence team has been all Power BI. I was much more likely to take Rui Romano's PowerShell scripts to do some automation, to get workspace auditing or activity data, than I ever was to touch Python or Spark. So if you

38:02 are, with that in mind, a data engineer or in data science, obviously this is well at home, this is nothing new. If you're a data scientist, you're going to know Python; that's a given. If you're a data engineer, you may or may not know Python; you'll probably know more SQL, you'd be a little bit of a DBA but a lot more data engineering. There are other tools that are a lot more

38:32 graphical, interface-driven, so you don't have to actually know code; you could use things like Talend or other data-engineering-type programs that do the work for you. But I'd argue Python is becoming a very common language. Any of the new computer science majors I talk to nowadays all know Python coming out. I'm not saying you have to be a computer science major to do this stuff, but it's taught everywhere; it's going to be the language everyone knows. So let me ask you, who are you teaching, then,

39:02 if you had a team of just Power BI? Because we're still in this migration period; we're still introducing Fabric to organizations that might previously have been just Power BI. If I'm listening to this podcast right now and I still don't have Fabric, but we're potentially going to introduce it at my organization, and I've been the Power BI pro with DAX and Power Query, yada yada yada, how long is it going to take before you expect someone in that

39:32 space, that persona, to start using semantic link labs? Say that again. You're asking, when's the right time to use it? Yeah, if I'm at an organization that's maybe about to introduce Fabric but doesn't have it yet, from the day my organization opens up Fabric, how long until you would expect that person to begin

40:03 using semantic link labs? Yeah, this is a good question. So let's talk about the maturity of your organization. Just because you're turning on Fabric doesn't necessarily mean your organization is immature in Power BI. If your team is larger, if you have rigor around using Best Practice Analyzer on your semantic models, or you're looking to tune or optimize your models, that right there already tells me your company is maturing. When you're getting to the place where you're optimizing models and making them more

40:34 performant, to make sure they're not getting bigger than they need to be. How the progression typically goes is: users start with Power BI, they build models at the Pro level, their models get larger and absorb more data, at some point they outgrow Pro licensing and move into, potentially, Premium Per User licensing, which gives you much larger semantic models. At some point you run into problems: the model runs slow, the visuals aren't working right, and you need to start analyzing what's going on

41:04 in the model; your model design now needs to get better. This was my progression too. I started out making crappy models just because they worked: okay, it works, let's move on. I got more mature, and as I developed better models I got better at making them more efficient. Okay, so where does semantic link labs fit here? If your organization is mature around optimizing, tuning, and building best practices into your models, you may want to regularly review those things. If you have a larger

41:35 team building models, there are going to be people regularly reviewing, or there should be

41:38 people regularly reviewing your models to make sure you're not building stuff that's going to abuse your capacity. That being said, when you go over to Fabric, if your organization is at that level of maturity, where you're tuning and driving for optimization, semantic link labs will be a great free tool, and again, free, to help you automate more of that. Let me give you an example. I have a deployment process; we're going to deploy some reports. Part of your deployment

42:08 process will likely be running Best Practice Analyzer on your model in test, or maybe you run it across all your production models once a month, something along those lines. That's the stuff you're going to need to do. When your models get really large, you're going to have lots of partition management to handle; you're going to need other tooling to help you adjust and manipulate them. And as your models get larger, you're not going to want to just flat-load all the tables. Sorry, another thought here. Users who

42:38 start with Power BI typically load the entire table all the time; when you start, you just load the whole table. Well, eventually you learn that not all data is changing in the source system, and maybe you should be smarter about how you incrementally load it. As your data grows in volume, you also don't want to load a hundred million records every night; that's inefficient, it's just wasteful. So then you start thinking, okay, how can I, in the best way possible, only load the data that's changing?

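That "only load what changed" idea is exactly what an incremental refresh policy encodes: a rolling window of recent partitions gets refreshed while older ones are left alone. Here is a hedged sketch of the window arithmetic in plain Python; the YYYY-MM partition-key naming is an assumption for illustration, and a real notebook would hand this list to a refresh call rather than compute it for its own sake.

```python
from datetime import date

def partitions_to_refresh(today: date, window_months: int):
    """Return YYYY-MM partition keys inside a rolling refresh window,
    newest first, the shape of a typical incremental refresh policy."""
    keys = []
    year, month = today.year, today.month
    for _ in range(window_months):
        keys.append(f"{year:04d}-{month:02d}")
        month -= 1
        if month == 0:          # roll back over a year boundary
            year, month = year - 1, 12
    return keys

# Refresh only the last 3 months instead of the full table.
recent = partitions_to_refresh(date(2025, 2, 12), 3)
# recent == ["2025-02", "2025-01", "2024-12"]
```

The payoff is the delta: a nightly job touches three month-partitions instead of a hundred million rows, which is the whole cost argument Mike makes next.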
43:10 Right, now you have to do more things around partition management and incremental refresh. But you've got to think about that; this is where the heavy automation of semantic link labs supports it. So it supports it, but man, that is actually a harder sell for me, the partition management side, because odds are I've been doing this with a UI, whether that's SSMS or Tabular Editor 3, for a long time. If that's the case, I'm sure it's faster in semantic link

43:41 sure it’s faster in cementing link Labs but again to sell someone who has to then ramp up or create all these notebooks is like okay like where’s the the value ad for that case I to me I’m thinking really semantic link labs is not for everyone like every single powerbi developer or heavy powerbi developer is going to want to get their Hands-On cementing Labs if anything it’s probably one or two people creating them that are going to help manage a lot of

44:12 other things. So Greg asks a great question in the chat here. He says, are you at the place right now where you're going to start recommending building greenfield, brand-new semantic models using only semantic link labs? Is that what you're saying? Let me just say it's possible to do that, but I would also

44:43 argue that you may not want to start with that. Building brand-new models is probably not the best opportunity for this one. It looks like a lot of the features in semantic link labs focus, at least initially, on debugging, Best Practice Analyzer, incremental refresh, and the partition pieces. There's a whole bunch of Power BI API stuff that, as an admin or an expert, you're going to need

45:13 to use, period, and now you don't have to use PowerShell or other tools. Yeah, it just handles all the authentication for you, so it makes everything so much easier. So I guess, in that regard, my question was about the people using semantic link labs. I think the misconception is that it's going to be for every Power BI developer; not every single heavy

45:44 enterprise developer is going to need to use semantic link labs. To your point, there's probably an admin or someone setting it up on behalf of the team or the department. Is everyone going to go in and utilize semantic link labs? Probably not. But if you're a professional, yes, you are. How many Power BI professionals should be using VertiPaq Analyzer? All of them. Yes, if

46:16 you're a professional in Power BI, every single one of you should be using VertiPaq Analyzer. It tells you how many columns there are, the size of those columns, and which columns are most wasteful. Every professional should be using it, and that's already built into semantic link labs. Right there, it's a single function: call the VertiPaq analyzer function in semantic link labs, send it the dataset, boom, done, the VertiPaq results come back.

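For a feel of what that output means, here is a small pure-Python stand-in for the ranking a VertiPaq-style analysis does: sort columns by size and flag high-cardinality ones as waste candidates. The column statistics and the 100,000-cardinality threshold are made up for illustration; in semantic link labs the real thing is the single built-in function call just described, not this sketch.

```python
def most_wasteful(columns, top=3):
    """Rank columns by size; flag high-cardinality ones as waste candidates."""
    ranked = sorted(columns, key=lambda c: c["bytes"], reverse=True)
    return [
        {**c, "waste_candidate": c["cardinality"] > 100_000}
        for c in ranked[:top]
    ]

stats = [  # made-up column statistics for illustration
    {"column": "OrderID",   "bytes": 9_000_000, "cardinality": 1_000_000},
    {"column": "Country",   "bytes": 40_000,    "cardinality": 50},
    {"column": "Timestamp", "bytes": 7_500_000, "cardinality": 900_000},
]
worst = most_wasteful(stats, top=2)
# "OrderID" and "Timestamp" surface first, both flagged as waste candidates
```

High-cardinality keys and timestamps dominating model size is exactly the pattern VertiPaq Analyzer exists to surface; low-cardinality columns like "Country" compress well and fall to the bottom.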
46:46 I don't need a third-party tool to connect, I don't need to do other stuff; it's literally right there, boom, done. So I disagree with your statement: should every Power BI professional be using it? Yeah, they all should. If what you're doing is around semantic models and you have access to Fabric, 100% you should be learning about and using different modules from semantic link labs. It will make you more efficient. Yeah, you should absolutely be using it. But here's the thing: if every single individual is creating their own notebook,

47:17 then, from a Wild West point of view, they all run the BPA. So are we pushing the data somewhere, or do we just have a thousand notebooks right now? No, again, if you're a professional developer, I'm already assuming you're going to start pulling some of this stuff together and centralizing it, the same way you're already centralizing lakehouses and notebooks in your data engineering. You're going to start making workspaces; your professionals should all be working together, and hopefully people are sharing notebooks with each other as

47:47 well. And I think there's a lot of overlap here: for BPA there's even a bulk run function, so why can't someone say, hey, I'm going to be the one creating these notebooks on behalf of the team, and they just run? What you're describing has nothing to do with using semantic link labs; that is all to do with your company's internal process. Do you have people smart enough to build things everyone can reuse? That's your process,

48:19 everyone can reuse that’s your process that has nothing to do with the centic link lap itself so and I think I think you’re arguing for something that is just a totally different scenario like you’re arguing for does your company have process no so here’s my worry all the things in here are great for the individual you take 10 people let’s take 10 people that are now all running submitting Labs on the wild right like some of these are really powerful we’re

48:49 some of these are really powerful we’re still dealing with apis and this is the argument here you’re still dealing with the like the apis that require some permissions that can do a lot here especially from an automation point of view if you are not careful everyone’s running through this what are you so worried about I don’t understand I don’t I do not I literally do not understand your question here like how yes there are some modifying things but if you if you don’t have access to the workspace or don’t have access to the items in that workspace you can’t touch them you can’t modify

49:20 you can’t touch them you can’t modify things you don’t have access to so it’s not like it’s it’s not like this is ripping across parts of workspaces that are giving you more access to things you couldn’t have before like so service principle that’s a moot point like it’s not g that’s no point in my mind now what I will argue though is you’re you are right but once someone has figured out a pattern here these patterns get set they just run and you just move on like right for example like you’re talking about running verac analyzer on all your models maybe that’s something you decide is a tool or an automation

49:52 you want to run every night. At the end of loading your models, maybe in the morning or the evening, run VertiPaq Analyzer on all of them and store the data in a lakehouse. That's the point: someone says, I'm going to do this, these are the workspaces that are important to me, and they just build the automation and walk away. Now the notebook can be scheduled with a pipeline, and you have all this extra data coming off your semantic models, the metadata, being captured into the lakehouse. This provides one huge layer of

50:24 automation going back to your organization. Now you can say: we've run BPA on all our models, we've run VertiPaq Analyzer on all our models, and by the way, we're already funneling all that data into a lakehouse in a common structure, and now you can report on it. Instead of having everyone going in and finding problems with models one by one, you can be proactive: hey, this column got really small, it dropped a lot one night, what happened? There was a data load issue, okay, now we can go look at it. This model is getting extremely large for some reason; it's been rapidly

50:54 growing over the last couple of weeks. What's going on? Did something change? These are all the signals you're going to want to administer and monitor as a professional in the Power BI and Fabric space, and this tool gives you the capability to do that to a much higher degree. And I completely agree with that. Honestly, what I would love to see is someone who's a Python champion, who already has that experience, turn a lot of these into, I don't want to use the word

51:25 into not I I don’t want to use the word templates but for anyone who’s using powerbi it’s like hey you want to view your measure dependencies I’ve created that notebook for the team right all you have to do is just plug in your workspace plug in your semantic model and it will run rather than taking someone who has to try to in a sense install create their own notebook a lot of these could be not even from a governance or security point of view from just an an adoption point of view so I believe Cen L Labs is in preview or not in preview

51:56 is in preview or not in preview it’s it’s neither it’s just a glass it’s just a project that Microsoft has built and supports it so it’s a it’s a wheel

52:04 a package that you import or install into your notebooks. Okay, so my understanding is that semantic link is installed by default; it's already part of the image for all your notebooks. Semantic link labs is not installed by default, so you have to install the package into the notebook, or you can even set up a custom Spark environment that says, hey, we're going to use this.

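For reference, the install step being described is a one-liner. The package name below comes from the project's PyPI listing; inside a Fabric notebook you would run it as a `%pip` magic (or bake the library into a custom environment, as Mike mentions), so treat this as a setup fragment rather than a script.

```shell
# In a Fabric notebook cell, prefix with "%" to use the pip magic:
#   %pip install semantic-link-labs
pip install semantic-link-labs
```

After installing, the library imports as `sempy_labs` (that import name is from the project's README); plain `sempy`, semantic link itself, needs no install.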
52:36 I will argue, though: I have tested semantic link in both Python notebooks and Fabric Spark notebooks, and it works really well in both. In the Python-only notebooks and the Spark notebooks with PySpark, semantic link labs and semantic link both seem to work very well, which is awesome; that makes everything incredibly easy for me. So as we get near the end, and I think we've talked about the capabilities from the user point of view, I don't

53:07 think either of us is going to argue about how much better this is than PowerShell or some of the old solutions. Mike, if you took a team today and were implementing semantic link labs across the board, what would be the top three things you'd migrate from the old APIs or XMLA? We've talked about incremental refresh, we've talked about the activity log. What would be the top

53:38 three things you think semantic link labs could automate, from a priority point of view? Yeah, if I'm looking at the best opportunities in semantic link labs, it's all around VertiPaq Analyzer; that's a really useful tool. I'm also thinking about the automation around getting data out: VertiPaq Analyzer is there, Best Practice Analyzer is there, doing things in bulk is there, super useful. And I'm finding a lot of value in complex refresh patterns inside semantic

54:10 models, meaning which partition to refresh when your models get really large. I like using semantic link to automate more of the model experience and more of how to very strategically refresh these models. Typically this is all done in PowerShell or other scripting. Think of it this way: I have a pipeline, I'm going to load some data into a table, and that data is then loaded into semantic models. But if that

54:41 data is really large, if we're talking about big data, you don't want to refresh the whole thing; it's just too much. So what you can do now in the pipeline is run your loading process, copy the data, and at the end of it execute a notebook that does the fancy refresh policy you need. Again, you're now at the place where you're optimizing your model and bringing cost down, because you don't want to refresh the entire model all the time. There's a whole bunch of automation here that I think makes this incredibly powerful. So

55:13 to me, if you're an admin of anything Power BI, 100% you had better be learning semantic link labs. It will make your life easier and simplify a lot of the mundane tasks you've already been doing. And I think Jake in the comments here is making a really good point: semantic link labs is like that stick you sometimes need to beat the developers over the head with, hey, you're not doing this right. You automate this, you start pulling out the data en masse, and then you can go

55:43 the data in mass and then you can now go back and have the conversation with hey your model doesn’t conform to our best business practice analyzer hey your model is getting too large hey these are proactive things you could use that’s going to help you make everything easier to work with in powerbi or even fabric for that for that matter anyway yeah no I I love it I think for me the the best practice analyzer obviously is a gold standard and be able to push that data in I was trying to look does this can you connect and actually do the activity log

56:13 with semantic link labs or not? Yes. You can get the activity log, you can get activities for a workspace, you can get the artifacts; there's a cataloging API that it hits as well. Again, these are API calls that would have been three or four calls in a row to get the data out, and this thing just handles the waiting for you. To do the cataloging calls for a workspace, geez. I'm just telling you, this is my point, Tommy: the fact that this

56:43 tool has so many functions in it, does so many things, it's just incredible how capable it is. And this is where I land: okay, fine, it may be a little harder to do some of the things I used to do in Tabular Editor or DAX Studio, maybe. But now that I have all these really rich things in a notebook, why would I want to use other tools instead? Now I can just use this notebook to automate

57:13 things. Let me give you another example: DAX Studio. Why do I use DAX Studio? To write and execute DAX queries. Well, I can write them now in a notebook, and I can have multiple cells. To me, the experience of writing multiple DAX statements inside a notebook is actually much better. Honestly, another reason: think of it, if I have a cell and I need to copy it multiple times, I can just take my DAX code, write a different EVALUATE statement, and see the results of the prior statement and the new one side by side. I would prefer to

57:43 write DAX in notebooks, honestly. To me that's a better experience: I can track all my changes in different cells and compare them. I love that. You can also do timings. DAX Studio does a good job of timing how long a query runs, and we all know you can write DAX multiple different ways. So why not write a notebook that accepts a DAX statement and then runs all the tests you need? Right now in Tabular Editor or DAX Studio there's no automation around, "Okay, I'm

58:14 going to run this DAX query three times and clear the cache each time." All of that can now be automated in a notebook. So you can do DAX Studio-level things with automation, doing repeated automated runs, without requiring DAX Studio anymore. This is where your creativity becomes the limit, not the tools. Maybe that's my point. Sorry, I'm getting really excited about this one. Awesome. Well, dude, I hope, for a lot
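The "run this query three times, clearing the cache each time" loop sketches naturally as a small harness. The query and cache-clear functions are injected so the timing logic works anywhere; in Fabric they would wrap `sempy.fabric.evaluate_dax` and semantic-link-labs' cache-clearing helper (both names are assumptions here):

```python
# A reusable cold-cache timing harness, in the spirit of DAX Studio's
# "clear cache" timing runs. The callables passed in are stand-ins.

import time

def timed_runs(run_query, clear_cache, runs=3):
    """Execute run_query `runs` times, clearing the cache before each run.

    Returns the per-run durations in seconds, so different DAX formulations
    can be benchmarked against each other automatically.
    """
    durations = []
    for _ in range(runs):
        clear_cache()  # cold-cache measurement before every run
        start = time.perf_counter()
        run_query()
        durations.append(time.perf_counter() - start)
    return durations

# Usage with stand-in callables (real ones would hit the semantic model):
times = timed_runs(lambda: sum(range(1000)), lambda: None, runs=3)
print([f"{t:.6f}s" for t in times])
```

Feeding the same harness several rewrites of one DAX query is the automated comparison the hosts describe.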

58:46 of people who were hesitant about starting with notebooks and Python and felt they didn't have a good reason, this is probably the most perfect reason to start learning and dive in. I agree. Regardless, I think this is a tool in your arsenal that you will need. Has it caught up to all the other rich tools out there? Maybe not quite yet. But based on the capabilities it has now, and the things that Semantic Link

59:16 Labs can do that other tools cannot, I'm making a bet: I think this tool is going to keep getting better, and I think it's going to continually close the gap with all the other tools out there. And for the various companies that complain, "Well, we can't use these other tools because they're third party, and Microsoft should just support something like this": now it's supported.

59:46 A lot of the other tools out there, you just don't need now. I don't think you need Tabular Editor; I don't think you need DAX Studio anymore. I think this closes the gap on a lot of those things. Anyway, that's where I stand on this one. Awesome, this has been wonderful. I hope it helps people who are just diving into Fabric or just getting started. And just to make sure, for those who are still on Power BI Premium: this is Fabric-only,

60:16 correct? Yes, this is a Fabric-only feature. We're talking notebooks; Semantic Link Labs only works in the notebook experience, so you have to turn on Fabric to get it working. But you can check it out in a trial: you can enable a Microsoft Fabric trial for your users, if your admin allows it. It's a great place to get started. It makes a lot of the admin and API pieces you normally couldn't touch way easier to get into, for both Fabric and Power BI, because the authentication is just handled. I'm super pro

60:48 this tool; it's amazing. I should probably do an entire YouTube series on the different patterns this thing can do, because a number of fundamental features have come out for Power BI this year, and this is one of them. People will need to learn it. It will change how you build things moving forward and it will make you more efficient, so you've got to learn it; I think it's a given. Anyway, any other final thoughts, Tommy? No, I think this is perfect.

61:20 For a lot of people who are not sure how to get started, what a perfect way to begin with all the things you're already doing from an automation point of view, and to get started with Python notebooks. Excellent. With that, we want to say thank you very much for all the time you spent on the podcast today, and thank you for checking us out and looking into Semantic Link Labs. I think you're going to love it; it's a great tool. If you liked this discussion, if you didn't know about Semantic Link Labs and you want others

61:51 to see the surface area of what Semantic Link Labs can do, please share the podcast with other people. It's a great opportunity to get exposure to things you're not used to. That being said, we really appreciate it if you would share the podcast; we don't advertise, we just do this for fun, and if you found value in this we'd love it if you'd share it with somebody else. Tommy, where else can people find the podcast? You can find us on Apple, Spotify, wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. Share with a friend, since we

62:21 do this for free. If you have a question, idea, or topic that you want us to talk about in a future episode, head over to powerbi.tips.

Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
