Microsoft Ignite Updates – Ep. 268
In Ep. 268, Mike, Seth, and Tommy do a fast, practical recap of what came out of Microsoft Ignite and the November 2023 update cycle for Microsoft Fabric and Power BI.
The point isn’t to read release notes on-air—it’s to translate the release wave into action: what’s worth piloting now, what needs more maturity, and what questions admins and platform owners should be asking before turning on the new shiny things.
News & Announcements
- Microsoft Fabric November 2023 update — A clear snapshot of what’s new in Fabric this month (and a good checklist for what to test in a sandbox before broad rollout).
- Empower Power BI users with Microsoft Fabric and Copilot — A useful lens on where Copilot can accelerate authoring and analysis—plus the prerequisites that make it trustworthy (model quality, security, and governance).
- Power BI November 2023 feature summary — The monthly Power BI roundup for authors and admins; scan this first to understand what changed for your end users.
- Submit a topic idea (Explicit Measures Podcast) — Send the crew a real-world problem and they’ll unpack tradeoffs and patterns on a future episode.
- PowerBI.tips Podcast — Subscribe and browse the full back catalog of Explicit Measures episodes.
- Power BI Theme Generator (Tips+) — Create and maintain consistent report theming fast (colors, typography, and layout style).
Main Discussion
Microsoft Ignite announcements and monthly feature drops can feel like a firehose—especially when you’re already juggling support requests, roadmap work, and governance. The crew’s through-line in this episode is a simple operating model: treat updates like intake, not trivia.
Instead of asking “what’s new?”, ask “what changes how we work?” The best updates are the ones that either (1) remove friction from day-to-day delivery or (2) unlock a new pattern you can standardize across the platform.
Key takeaways:
- Build a monthly evaluation rhythm. One owner skims the release notes, pulls 3–5 candidates, and schedules a short test window.
- Be explicit about maturity (GA vs. preview). If it’s preview, decide whether it’s a controlled pilot or a hard ‘not yet’.
- Copilot is only as good as the semantic model behind it. Strong definitions, clean relationships, and governed measures turn Copilot from a parlor trick into a real accelerator.
- Platform convergence means standards matter more. The closer Fabric and Power BI get, the more you benefit from repeatable patterns (naming, workspaces, deployment, and monitoring).
- Admin settings and capacity planning are part of feature adoption. A feature that looks great in a demo can create tenant-wide noise if you don’t plan for security, refresh, and usage impact.
- Write down ‘what changed’ for your users. A tiny internal changelog (what’s new + who it affects + how to try it) beats surprise every time.
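One concrete example from the episode is an AI assistant generating a "rolling 60-day" measure. Before certifying a measure like that, it can help to validate the window logic against raw data outside the model. Here is a minimal pandas sketch of the same calculation (the data and column names are hypothetical, purely for illustration):

```python
import pandas as pd

# Hypothetical daily totals; names are illustrative, not from the episode.
df = pd.DataFrame({
    "date": pd.date_range("2023-01-01", periods=120, freq="D"),
    "sales": range(120),
}).set_index("date")

# Time-based rolling window: each row sums the 60 days ending on that date.
df["sales_60d"] = df["sales"].rolling("60D").sum()

print(df["sales_60d"].iloc[-1])  # total over the final 60 days
```

The DAX equivalent would typically use a window over a date table; the point is that a governed, certified measure should pin down exactly what the window means (does it include today? calendar days or rows?) before Copilot or end users lean on it.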
Looking Forward
Run a 30-minute monthly release review: pick one Fabric item and one Power BI item to test, then publish a simple go/no-go recommendation to the team.
Episode Transcript
0:31 welcome back everyone to the Explicit Measures podcast with Tommy Seth and Mike hello everyone Mike good morning gentlemen well today’s major topic is going to be around let’s talk about what happened at Microsoft Ignite a lot of announcements have been made we got new visuals we got new features there’s things called Copilot all over the place we got to figure out what does this mean to us now so there’s a lot of things that seem to have changed here recently looking forward to discussing what we think are impactful
1:01 what we really are excited about and things that are just eh maybe we’re okay with so we’ll give you a real opinion around all the new features coming out and I guess we’ll have to dive into it a little bit but before we do that let’s do some news any news articles here anything that’s worthy of talking about other than Ignite probably semantic models are our thing now well we’re going to have a giveaway oh yes would you like me to go Tommy it up
1:31 oh man so excited to say this so we’ve done this once in the podcast’s lifetime and we’re going to do it again what a perfect time to do so are we doing two promotions or just one right now just do one for now or two whatever I don’t care sure we’ll do two why not so if you love Power BI if you love PowerBI.tips themes and you love the Explicit Measures podcast in honor of Microsoft Fabric becoming generally available as of yesterday which is incredible we’re giving away 10
2:01 subscriptions for two weeks on PowerBI.tips Plus PowerBI.tips Plus is for the theme generator and let me put it this way guys obviously I work with Mike I help test the themes generator and if Mike said Mike or Tommy you’re gonna have to pay for it my next sentence would be here’s my card info so it’s very reasonably priced to begin with but we want to get people started a bit more so how do you actually enter into this drawing well go on LinkedIn and Twitter and tell others and
2:31 us why you like the podcast with the hashtag explicit measures podcast and you will automatically be entered into the drawing this starts now until the end of the month so as many times as you want to post you’re entered into the drawing we’re going to do a random drawing and 10 giveaways that’s the first promotion we’re doing the second one is also exciting there’s a conference going on right now what a perfect time some of you may actually be there there’s a lot more conferences going on well and this is a brainchild
3:03 Mike I love this go ahead take a picture of yourself at a conference with the hashtag explicit measures podcast saying you’re at the conference you’re doing great that’s one type of giveaway Mike tell me about the other one yeah if you just show up at a conference by a sign just share some love there take a picture of yourself make sure that you promote that you like the Explicit Measures podcast while you’re there we’re happy to promote the event as well as the podcast so we’ll give you a week free of Tips Plus if you post that also yeah if you can find a
3:34 program manager from Microsoft we’ll give you two weeks of Tips Plus for free a program manager to take a picture with yeah it’s not just a selfie you have to have a picture with any one of the product managers with Microsoft make sure you tag I love Microsoft and I love the explicit measures podcast we’re looking for the explicit measures podcast tag that will
4:04 also get you in the running for some free Tips Plus subscriptions as well so we appreciate the community the community is awesome a lot of fun we felt this would be a really interesting game to play here and see who wants to jump in and join the explicit measures community a little bit more here as we try to continue talking about these great tools that have changed all of our careers we’re doing this stuff full-time now so we hope you’re enjoying it as much as we are yeah and just to reiterate Mike’s point the rules of the road here the promotion starts today goes through the end of the month in
4:35 order to be entered into either drawing you must have the hashtag explicit measures podcast yes awesome with that let’s jump into our main topic for today Ignite has happened what is Ignite Tommy give us the rundown for those who are not following along the news here who maybe haven’t heard what’s going on I can’t assume that everyone’s jumping in on this what is Ignite I would say Microsoft Ignite is the second or third largest conference that Microsoft puts on every year it’s probably behind Microsoft Build which
5:05 happens in the spring and Ignite is their fall event very developer focused if you watched the keynote yesterday about chips and compute it’s a little more developer focused it’s all about the developers all about trying to build a better UI experience for them better technology for them and obviously there’s a lot of correlation to what we do so that’s it believe it or not it felt like this conference even though it was very heavy
5:35 developer focused I don’t feel like the things that I’ve been hearing so far have been super developer intense at this point it’s been interesting to see what’s going on there there have been a couple sessions that I think are a bit more what I would say in the weeds more technical in nature but for whatever reason this Ignite has felt much more marketing and look at us we’re presenting a lot of publicly facing things now more so than in the past that’s just maybe my perception they had a 3D rack during the
6:06 main keynote by Satya it was a giant compute rack that was all virtual showing I think the highest powered virtual machine publicly available in the world yeah I saw that that was part of his main announcement in the keynote of the event excellent so where do we want to take it from here what do we want to talk about so there’s a lot of announcements that have been made around specifically Fabric I guess maybe the first one I’d like to touch on here is if you were using Fabric on
6:36 Tuesday you were using Fabric in a non-production environment but by the way Wednesday you can use it in production so I think Wednesday they announced that Fabric is now generally available it is now ready for production workloads you can start using Fabric for your production environments were you expecting that to happen this early I thought it would be 2024 that Fabric would be all GA but that’s what it is today it’s interesting that you
7:07 mention that because there are portions of Fabric that are now GA and there are portions that are still not GA still preview so I think they got enough done to say yes it’s GA it’s going well I also heard I think it was in the keynote I think Arun was making some comments on a little video somewhere in the middle of Ignite they have 25,000 customers not people using it 25,000
7:37 customers and a large majority of the Fortune 500 are already using Fabric at some level of capacity and even though they’re using all this capacity and they’re jumping in with it there are already three different workloads that people are using inside those Fabric environments so I don’t know if this is a testament to if you give something away for free people will use it or it was really that game changer thing that people really wanted to get in and manipulate and play with
8:07 so it’ll be interesting to see how that adoption continues to move a lot of this has been under the trial experience I know I have two organizations both of them have workloads running inside Fabric because it was free it’s easy just click a button I get an F64 I’m getting free premium essentially P1 capabilities out of the box right away so I was loving it trying to experiment with things figure out what workloads I can put there and as someone
8:39 who’s more I would call myself more of a data engineer than I would a data modeler or a developer of reports because I think more of my skills align to that area because I’ve been doing more of the backend engineering of things as I’ve been moving more workloads over to Fabric there’s a lot of solid aspects of being able to get data in and get it into reports it makes it pretty easy with a couple clicks the free thing is a large point here and obviously we know that we’re doing a promotion for free stuff but more
9:10 importantly this doesn’t work if they don’t give it away for free and I truly believe that if they didn’t give away Fabric for free at the beginning of the trial period during the preview I don’t know how well Fabric works it’s the same strategy they do with Power BI today yes correct Desktop I was doing a Dashboard in a Day yesterday I’m like yeah for those who are old who had SSAS and you had licensing you get that now for free just download it and I think that’s a beautiful thing yeah you’re testing Fabric of course it’s a little
9:40 buggy or was and now with GA expecting those bugs to go away but more importantly it’s like oh I see how I can now fit this in my own workloads and as a consultant it’s oh I see how this can work for all these other scenarios that doesn’t happen otherwise think how much money they spent not just to develop it but then to allow people to run these compute workloads these machines these engines so yeah when we transition from
10:10 really low-level SKUs to allow people to do 25% of something it was really hard to get a full idea of the POC without actually spooling things up so transitioning to yeah you have everything you would need to understand whether or not this is going to work for you the only thing you may not have is this thing called sharing or whatever the case may be from the Power BI side but now going through it from a deployment standpoint obviously
10:43 there’s been a large shift to let people kick the tires because there’s larger adoption of these tool sets because people get to work with them understand fully what they can and can’t do for their work streams and then light them up hopefully with GA I think the next big ticket item here is if we were all playing when is Copilot coming to Power BI we all lost that because it’s now that’s the next major announcement where
11:15 Copilot is a public preview Copilot in Microsoft Fabric for report building is one of the major parts where it’s like hey let’s build I need a summary of this they’ve rebranded a few things which makes perfect sense the smart narrative feature is now Copilot yeah which just logically makes sense I think they’re building it into the Q&A feature as well which again makes perfect sense it’s the next logical step and that’s not
11:45 long ago right like where it’s going to recommend synonyms the different things that you would need in your model and the complexity around all that so it’s cool that they plugged it into that specific spot report creation the DAX writing experience I think there’s another one Mike I saw a demo of them doing some Power Query as well there was a demo of Power Query having Copilot as well hey I want to take this column or find the distance in time between
12:16 this start date and the end date what’s the range and it was able to pick up columns that had date things date start date end and was able to get a duration between them and start writing some of your M code for you which I think people will eventually start writing their own code because they start seeing it enough because you start learning it but you need something to initially write the code for you so you can learn as you go along I think that’s very important here this is how I learn a lot of my code pieces give me samples of code autogenerate
12:47 some stuff for me let me see it being generated and then I can make the connection between what I asked for and what I’m getting which is super helpful and we know this can work if this was a year ago it’s not going to write DAX but I did that experiment with that Cursor AI tool and I fed it literally dax.guide and SQLBI I said here’s documentation I need a rolling 60-day measure and with really no tweaks I told it what my model looked like I had to define it obviously and the code came
13:17 out pretty darn well so we know this can work there’s a few gripes here I have personally the synonym feature is coming out November 2023 but a lot of the features of Copilot in Microsoft Fabric I think with notebooks probably with dataflows Power Query that’s probably gonna happen by the end of March 2024 so that’s the one gripe I have great feature I’m glad you’re announcing it we’re still five
13:47 months away four months away I’m also very aware that a lot of these Copilot level features it’s expensive to run them right another announcement that corresponds with a lot of this Copilot integration with Power BI is Microsoft has I’ve been talking about this a lot the first thing you do is you build a product right so you build ChatGPT or you build a product that people want to use you start incorporating it you gain adoption so you put it in Bing you put it in your Edge browser you put it all over the place where people can easily use it and say wow
14:18 this thing actually adds value to my life I like to use it that drives adoption and then the third phase of this is once you have a user base you’ve made it the third step is to optimize what you’re building and there are a ton of announcements around Microsoft making their own chips they’re building data centers more effectively these chips and compute are very expensive from a heat and energy usage standpoint so they’re doing a lot of work around how do we make this ChatGPT experience as
14:48 efficient as possible where you can run these models I was talking with a data scientist recently and he was saying a ChatGPT model running on a graphics processing unit eats an inordinate amount of graphics processing for a long period of time so it’s literally maxing compute and max wattage usage for a period of time I can’t remember what he said but as long as the thing is prompting it’s pegged out it’s full usage of whatever that GPU is doing so in order to serve millions of
15:20 requests back through these things you have to have infrastructure that supports it and it’s got to be efficient you’ve got to figure out faster ways of doing the prompting so it’ll be interesting to see where Copilot actually lands as far as does it come in Pro does it come in Premium Per User can you get it in an F SKU which F SKUs will it start applying to so I think there are going to be thresholds here for organizations some will be able to get it earlier than others because they have a larger amount of spend inside the
15:51 Power BI ecosystem with all the Copilot and AI stuff did they spend any time in any of the presentations talking about the data itself when you’re spooling up LLMs or whatever security is a big thing right even in the preview is it what I would expect where it’s locked down to your tenant it’s only looking at your data your data is not getting sent anywhere or did they address that at all yes yeah they
16:22 said a lot of these things this is your data again this has been in the news a lot and NVIDIA came on stage and talked a lot about how their graphics processors combined with Microsoft infrastructure are really going to help you build models that live only with your data right the queries that you’re running against it it will answer them but your data is not being used to go back and retrain models right so if you send it code the idea is it will use that code it will then do some
16:52 large language model elements on top of it but it’s not going to then go back into the okay now it’s part of the large pool of all large language models and we’re going to train on it so yeah I think that’s a really important point Copilot for code editing like yeah fine right I understand there is some potentially very proprietary code in that but if Copilot all of a sudden is now like hey generate a report for me you just open up an entire
17:24 data set or semantic model to that right so not a data set we don’t have those anymore a semantic model of data that’s another thing that’s happening here the name of the models themselves has changed now we’re no longer talking data sets data sets have gone by the wayside which I think is a very smart move by Microsoft honestly I know for me I can’t use VS Code now without Copilot and I think that’s just gonna happen as part of the workflow with Fabric I
17:54 think another feature that got swept under the rug it was not one of the major announcements at least in the first keynote but Power BI semantic model support for Direct Lake expands from just lakehouses to also the data warehouse that’s pretty big oh I think so too so just to give a background here Direct Lake mode it’s not importing data but connecting to data and a Power BI semantic model before only
18:25 supported lakehouses in Microsoft Fabric it did not support if I wanted to create a warehouse if I wanted to create the SQL warehouse that was going to be in import mode that now also has the capability for Direct Lake and again that is really yeah so that flies man that is one of my favorite features here because not every time do I need to do a lakehouse maybe I don’t have files maybe I’m connecting to a SQL database right
18:56 true and there are a lot of scenarios I’ve run into where I could do a lakehouse but I would only do the lakehouse just for Direct Lake mode yeah that makes sense to me I’m just like there was probably some plumbing that had to be rewired fixed whatever but we know regardless if you’re building things in the SQL warehouse or the lakehouse or whatever those objects are the same right so to support one and not the other it’s yeah it’s
19:27 awesome that that works in all the methods now for connection because I would imagine there are a lot more SQL based solutions than there are lakehouse with notebook yeah and I was going to say it’s going to take a while for everything to migrate if everything migrates to the lakehouse SQL databases aren’t going anywhere at least not maybe in the next 10 years but it’s going to take a while to migrate anything
19:58 over so I am all for this feature another one I want to point out here we’re talking about a lot of little things here let’s keep going through some features there’s a ton of stuff that’s been announced yeah there’s a new thing called real-time replication or mirroring oh yeah I was wondering your thoughts so I’m not a SQL traditional guy I’m more of the warehouse modern person on this one this mirroring effect
20:21 connecting directly to a database listening to the change data capture log taking all the information in it sounds like they’re trying to keep these tables up to date as quickly as possible Seth what are your thoughts on this thing is this a useful feature or is this something that’s more nuanced for people who are using large databases and getting it into Fabric yeah I think I’m excited about this one because in some ways it seems like the shortcut within Fabric right like I have an object I’m not moving it anywhere
20:51 you can just access it and this is now something that is on the data source side right so connecting to a data source and not having to move data that’s important that is a huge thing and I think it also solves a lot of the larger problem where you would have okay well I’m sold on Fabric I want to do more with it but I have to plan a migration
21:22 of moving all of my data and the structures into this thing and with mirroring based on your description to me cuz I’m the hot take guy you guys got to have all the fun yesterday and absorb this information firsthand but ultimately to me that’s going to lower the bar for people to push or use Fabric much faster and not have to worry about redesigning data source
21:53 systems that may already have a lot of business logic and ETL or ELT built into them it sounds like there’s this need for virtualizing data that’s becoming very important right create your data move it once let it stay there don’t move it again I think the more that Microsoft can lean into this this seems like another strategic move for them for why Fabric is there what’s inside the Fabric experience all the lakehouse elements they’re building another feature
22:23 that dovetails along with this the idea of creating your data and then letting it live there there was another feature they were talking about I don’t know the name of it off the top of my head there’s a feature where you can take an imported data set so you have a data set or semantic model yeah that is imported you can take that imported model and you can push the tables back down to the lakehouse so someone in
22:55 one of my classes pointed out hey I saw this feature can you talk more about it I’m like I don’t know about this one yet this was recently announced I didn’t really hear about this one coming down the pipeline very interesting because this is allowing you to take any existing semantic model and query the tables as if they were coming directly from the lake which is very interesting I want to see this feature in action I want to look at this one a little bit more in detail because I want to focus on if I take an already imported model and push it down to my lake what happens to those tables when I
23:26 refresh the data set does it automatically keep those things synchronized for me or not I don’t know yet but this is going to make it very easy for us to potentially migrate existing workloads into other Fabric elements and if I think about it here as well inside a semantic model you already have M you have tables and you have Analysis Services all built into the Power BI report yeah there’s no reason why you couldn’t just rip it out why don’t we just move the M over to Dataflows
23:56 Gen2 why don’t we just move those tables down to the lakehouse there should be some let’s call it migration tools right if you’re ready to go to Fabric this is what I would call the proper enterprise story that we’ve been looking for build everything in Desktop get a good proof of concept rocking and rolling then drop it down to the lake and it automatically builds here’s the dataflow that supports your tables that supports your report all the way through boom done that would be really amazing
24:26 if we can get to that point where we could really say let’s take a development exercise around a report and move it into an enterprise grade type solution yeah what’s nuts with mirroring as I’m scrolling through here is it’s not just the source right it’s the ability to join into the data sets in an efficient manner yeah that’s big to me it hits one of these moments for me where I’m like people keep building really big tables
24:56 inside Excel to get access to all their data not sure that’s the best solution anymore maybe we get them other tooling that lets them pull wide long tables of information out of it is there a more efficient way than to have this inside a cube maybe there is now maybe we can build other tools around this that are going to let people get access to data by dumping out large wide tables so this one feature alone tells me two very large things that are coming down one is we have no idea what Direct Lake is
25:26 really capable of and I think we’re just finding out what the technology is going to do the other side of this too is they say hey you’re going to be able to query using T-SQL Python Scala or PySpark using this feature so I can actually query a Power BI semantic model via Direct Lake with Python and Scala this goes back to the Power BI semantic model is no longer the end of the road of your analytic solution it’s just becoming a part of
25:57 the path it’s another road with a lot more directions yes and this is the capability of this and to me I cannot overstate how big this is for what we do we’ve worked in the last seven or eight years of our lives where Power BI models semantic models were the end of our analytical solutions because whatever we transformed in Power Query it wasn’t going anywhere else that’s now just part of the analytical journey and so these are underrated features I think for where
26:28 we’re going with business intelligence I can see an argument how that’s amazing and I can also see an argument where it’s going to create even more problems go on well the way I think about this is now as I’m developing solutions and a semantic model to serve up multiple reports or like a warehouse right facts dimensions etc a data mart I’m
27:01 building an object to serve out reporting and logic that makes sense for a bunch of reporting if I then use that as a data source it’s not the end of the road right and I now have pieces that go into it and then further dependencies not just from a report side but dependencies that could create problems for me changing things in that semantic model
27:32 in the future because now it’s a data source for something else and ultimately is this a fantastic play because now you would absolutely need Purview or something that would hook all of this together otherwise you would have absolutely no idea what’s using what possibly right so I’m not saying there’s not value there I just when we get into what I describe as data spiderwebs of I have no idea what’s happening anymore that’s a hindrance to change because you wouldn’t know if you
28:03 change because you wouldn’t know if you had to make a fundamental change what it would be breaking Downstream and that’s what scares me about it it’s just like I haven’t thought through like hey is this a fantastic thing I’m not as excited as you are right out of the gate because I I think that is going to be part of a a potential problem in the future that we have to deal with I think we got four episodes on that coming up probably because this goes back to like is it only certified data sets that go with is probably not every data set but this absolutely Tommy said it Tommy
28:34 Tommy, let everyone know: this is it. I've been saying it for weeks now, everyone. Certified datasets are where we want to put our effort. Yeah, and Tommy, it's a certified semantic model now, isn't it? Earlier today Tommy made a submission saying certified datasets are probably where we should invest our time, and we all agree. I'm just teasing, Tommy. Well, first off, is it even a certified semantic model? I don't know.

29:04 It's not a certified dataset anymore, so it's a certified semantic model. Let's add more words to the thing. A certified semantic data model, something like that. Regardless, yes, I've never said I don't like certified whatever-it-is; I'm just giving you a hard time. But to your point, Seth, this absolutely elevates the business user who's creating these models, and it definitely elevates what the model's capabilities should be, where maybe it doesn't become

29:34 as much of a commodity anymore. I don't know, we'll see where that goes; there's a lot more conversation to have. The fantastic part is that if you really understand that semantic model, then plugging into something highly curated, in a form you can plug into a query or use as a data source, is hugely valuable. The other thing, which is probably another whole episode: we absolutely need to be really clear about where and

30:04 how the transformations and documentation happen. Now how many different places are things being modified and changed? You would need to understand all of them in order to use a model as a data source, because there's so much business logic behind it. It'll be interesting. I love the possibilities; it just scares me.
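The "query a semantic model from Python" idea discussed above can be sketched roughly as follows. The model, table, and column names are hypothetical, and the SemPy call is shown commented out because it only runs inside a Microsoft Fabric notebook:

```python
# Hypothetical sketch of querying a Power BI semantic model from Python.
# Model, table, and column names are made up for illustration.

def build_topn_dax(table: str, label_col: str, value_col: str, n: int = 10) -> str:
    """Build a DAX query returning the top-n groups of a table by a summed column."""
    return (
        f"EVALUATE TOPN({n}, "
        f"SUMMARIZECOLUMNS('{table}'[{label_col}], "
        f"\"Total\", SUM('{table}'[{value_col}])), [Total], DESC)"
    )

query = build_topn_dax("Sales", "Product", "Amount", n=5)
print(query)

# Inside a Fabric notebook, the semantic-link (SemPy) library can execute
# this query against a model and return a pandas-style DataFrame:
#   import sempy.fabric as fabric
#   df = fabric.evaluate_dax("Contoso Sales Model", query)
```

The point of the sketch is the workflow, not the specific query: once the model is addressable from Python, it becomes one more data source on the path rather than the end of the road.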
30:30 Well, I'm hoping it's not one of those global toggles in the admin portal, "OneLake integration for all your semantic models." This should be something scoped to certain security groups or workspaces that have that access. What else have we got? Dude, we've talked about this a bunch: the public preview of the DAX query view, data analysis expressions, our favorite. Yes. I was hoping we'd get to this one sooner rather than later. This one's amazing, given the amount of times

31:02 you've had issues or challenges with these things. NotAboutTheCell, thank you very much for the sponsorship on this one. It's been a great Ignite launch, and this is one of the main features we wanted to talk about. Tommy will for sure buy me a steak over that one, because he talked about certified data models. Is NotAboutTheCell the Microsoft liaison? Maybe Alex is doing that; he's our Microsoft guy, he's who they brought over. Exactly. So yeah, this DAX query

31:34 view: it feels like we're pulling some of the features we typically had in DAX Studio or Tabular Editor into Desktop now. I've heard a lot of language from customers saying that if we need a third-party tool to do DAX tuning and development, it should just be built into the tool. Totally understandable, but I think the DAX query experience is aimed more at developers than at new users and builders. So to me this is more investment that Microsoft has been

32:04 willing to make in the heavier development features. I'm going to say yes and no to that. Part of my training is an intermediate or advanced DAX course plus an optimization course, two separate courses, and we have to go to an external tool for the performance optimization. You can tell the fear and dread on people's faces when you say, "what's that? DAX Studio?" "A studio? I don't want to go into a studio." Just the

32:34 idea of going to this different area. These features, like EVALUATE, aren't terribly difficult, but now that they're pre-built there's going to be a lot more adoption of DAX queries, which are such a good testing tool. Not just for optimization, but for testing things out: what is actually my number? That's something you can easily do in a SQL database. Also, I can test something without blowing up my model. So I'm happy to see this one, for new users to get started

33:07 with writing DAX queries, but selfishly for me too, because there are a lot of use cases. I think regardless you still need some tooling around this; it's not perfect yet, but I'm hoping they continue to invest in it. I hope it continues to look more like DAX Studio: I can write this query and see it run. I think there are even easy buttons to move from Performance Analyzer straight into the DAX query view, which is going to be really helpful. There is
33:37 more than one beginner, or rather more than one persona, and Alex, you're saying it exactly right: there is definitely more than one kind of user in Desktop. There are beginners just trying to get data in from a model and make a visual on a page, but there are also advanced people, and a lot of organizations use Desktop to build the semantic model that is their definition; they're not willing to commit to Tabular Editor and building models only in code. So we need a happy medium, and

34:08 having a desktop application that does more of what we want is going to be extremely helpful. I 100% agree with Alex's comment: not only personas, but the idea that Desktop should be way simpler out of the box and also have a developer-mode toggle. Dude, I've been on that train for a long time. You have completely different sets of people engaging with this tool. We err a lot on the "uber developer mode" side, CI/CD and all of that, and it's fantastic, but to the lay

34:40 user, things like this are the view they should hit first, and then as you get comfortable you get a lot of value out of the tool without hitting these really hard technical stops. Something like the DAX query view is just going to drive more adoption of the tool, and more people engaged when you need them to be. Even when we talk about adoption and how many users an organization has, it's a small percentage

35:11 who are going to be engaged in building reporting, but the easier you make it, the lower the barrier, and you pick up a few more percent of people to bring along for the ride. Yeah, I think it's going to be a win. I'm really liking this feature already; I think it's going to be solid. I think this is going to be a new staple in many training organizations: you've gone through the beginner material, you're at the intermediate level and starting to get a bit better, and this is the next logical step to get you

35:41 deeper. So I think this is a very good move in that direction. Also along these lines, the PBIP format: they've been working on this for a while, and it's in preview right now, but that's another very developer-centric way of thinking about building reports, so I'm also very pleased to see more investment in those areas. I know there are a few other things in the main announcements. Mike, do you have anything else? Because it looks like there are literally 25 pages on just the Fabric November update alone. It's going to
36:11 be overwhelming; I don't know where you want to go. I think we should talk about the new slicers. We have a new button slicer, which looks amazing by the way; this is work being done by Miguel Myers. I can't tell you how clunky slicers have been in the past: you couldn't really change them, stylize them, or tweak them very well. Already I'm seeing reports come out. I think I saw someone, Jane Park from Enterprise DNA I believe, pop out a report with a Nike shoe look and

36:43 feel: it had all these cool shoes, you just click on a shoe, it had a little title next to it, looked really neat, and it changed the report page. So people are already starting to investigate and figure out how these different slicers will work. That's going to be another very relevant one, and I'm very excited about it for enhancing your reporting. I recently had a project where we needed to enhance the slicer or add some additional elements. Even the new KPI card pieces: it's a lot easier to

37:14 build these things now that you can actually add images, graphics, titles, and subtitles. It really feels like a more robust experience, so I'm excited about the new visual improvements. The team has also been doing a much better job of not hiding the doggone eraser button for when you have a filter context; just keeping it there all the time and letting me see it is very helpful. The new visuals are much improved. Yeah, and the reference labels go along with that too. All of this is probably coming out of Miguel's team; all of

37:44 these visualization enhancements are the polish people have been asking for for a long time. Yes, and I'm happy to see them as part of the launch, because these are the big items you'll see right out of the gate. The only way you could have produced these effects in the past was by compiling many different visuals on top of one another, a very laborious process, whereas now all this is starting to get really simple and really polished, and
38:15 I think, to the earlier point, it's just going to increase usage of the tooling. Here's one you were looking for, Seth; this is more on the service side. Row-level security, object-level security, and stored credentials now work with Direct Lake semantic models. That was one major blocker: you said you couldn't use Direct Lake without RLS and OLS supported through Direct Lake connectivity. Which is huge. That's massive. Without it, the

38:46 usability of the feature was reduced, but with it you can now cover many more use cases. Yeah, absolutely. Another really interesting one here that I want to point out: your feedback on semantic model scale-out was very interesting. It basically gives you a lot of read-only copies. When I first read this one I was unsure, but then I read a little

39:17 further: you can do refresh isolation, where something refreshes in an isolated manner and is quickly swapped in at the end, enabling more advanced refresh scenarios. You also get increased query throughput, because multiple queries can hit multiple Analysis Services instances that are all copies of the same model. And if I read the article correctly, this all lives on top of Direct Lake, where it makes sense: the

39:48 data isn't going anywhere; all you're doing is reading the data into the model, and now instead of one Analysis Services model you have three or four of them that can all read or load the same information. This is huge. I think it's going to be a really neat feature for companies that need to scale out large volumes of queries across their models, and it will be really neat to see it work. I could not agree with you more. The fantastic part about an improvement like this is that it's

40:18 Microsoft making these things more efficient for you, with less impact on the end user, which lets us utilize the compute in a really efficient way, so that these

40:40 models, capacities, and usage don't impact the end user. Whether the motivation is "we're now on a Fabric SKU and utilizing it for other things, so we need to be more performant here," I don't care; it's a fantastic improvement to the end-user experience, being able to refresh in that manner and validate before it gets swapped into the production version of things.
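A rough way to picture the scale-out behavior described above is round-robin routing of read queries across identical replicas. This is an illustrative simulation of the concept, not how the service is actually implemented; replica names and counts are made up:

```python
# Illustrative simulation of read-replica scale-out: incoming queries are
# spread across read-only copies of the model, while a separate primary
# copy can refresh in isolation and be swapped in when done.
from itertools import cycle

replicas = ["read-replica-1", "read-replica-2", "read-replica-3"]
router = cycle(replicas)  # naive round-robin query routing

served = {name: 0 for name in replicas}
for _ in range(9):              # nine incoming report queries
    served[next(router)] += 1   # each query lands on the next replica in turn

print(served)  # load is spread evenly across the copies
```

The win is exactly what the discussion describes: query throughput scales with the number of copies, while a refresh never blocks readers because it happens on a copy that is not serving queries.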
41:11 But think about what's happening here. We moved from Premium Gen1 to Premium Gen2, and in Gen2 we stopped caring about memory and started caring about compute. When you care about compute more than about how big the files are: way back in the early days, we could only have a 3-gigabyte model living on a machine that had 3 gigabytes of memory, but they've essentially released that constraint, and now everything focuses purely on how much compute you need to run the thing. So in this

41:43 scenario, who cares how much memory these models are eating up? A model could eat up an inordinate amount of memory; it's only the querying, the compute capacity, that your spend actually comes from. You're buying dedicated CUs, and those CUs are what cost you money to run the system. That's been a really interesting move by Microsoft, and I'm seeing a lot of interesting new design patterns
42:14 interesting new design patterns because of this this to me feels like one of those shift in design patterns so I’m going to shift gears quickly we touched on this but I really really may be this may be the thing I’m most excited about in the future and that’s that narrative visual co-pilot we I think our eighth episode ever we talked about does QA work with powerbi and we all came to the consensus not really it just it’s just not there it’s not something we utilize it’s
42:45 there it’s not something we utilize it’s not something we see clients or other organizations utilize but if you go to the powerb November feature summary blog which again also came out the same time the narrative visual co-pilot you actually see the demo of them walking through this this is I think the envisionment of what we’ve all had of what we needed I I Feel For the First Time with narrative visual with Q& A the most promising or the most positive outlook towards this becoming part of
43:16 outlook towards this becoming part of workflows part of major reports and and part of really what it should deliver it’s it’s more than just hey what are my top show me bu a bar chart right it’s the it was clunky before the narrative science to try to choose the metrics that you wanted part of that automated Insight they’ve never got it right it was always Off the Mark somehow look at the the visual side of it where I can say hey I’m G to tell me what’s going on but I’m G to have a check marks on what metrics and Fields I
43:48 check marks on what metrics and Fields I want you to do that context off of so I can almost model that visual based on context that I’m choosing which was never available before it provides some suggestions but I’m choosing what context I want it to provide based on the visual and also based on my model itself we’re getting to a point where this is I think going to be part of our training best practices to a certain degree I am personally really really excited about this yeah I’m I’m
44:20 Yeah, I'm going to say I'm not as thrilled about the feature as you are. I don't get a lot of people asking me, "could you just give me a large blob of text to read around my report?" A couple of bullet points might be relevant, though. I do think it's going to be helpful in some scenarios; I've definitely been in situations where reports were required to have a summary of what's going on on the page, but I've used this kind of thing with varying degrees of success in the past. So while this is probably a large leap forward

44:51 in improving what gets generated and how the information looks, I'm not sure it's going to be a staple. We'll have to give it some time; I'm going to reserve my enthusiasm. The Copilot visual could be interesting: if I could use it to say "as a bar chart, I want to do these things," and it smartly grabbed the data fields and built what I want for a single visual, maybe I'd be a bit more friendly toward it. So we'll

45:22 have to see where this one lands. I'm not sure how I'm going to use this visual functionally to begin with, so I have some reservations; we'll see. Maybe this is because I've been tainted by past experiences with AI-based visuals. 100%. And I keep asking: is it really going to wow me, the way Copilot does inside Edge? Is it going to provide that same level of wow? If it does, okay, I could see people using it.
45:52 Where I see it being more impactful: I would rather have a prompt window in the report than a window showing text about what the report is doing. Better prompting against the actual report page would be very helpful; I think that would be more useful to people. Here's my prediction, and why I think you guys should have a slightly sunnier outlook on this: no one has asked for it because it never worked. When we've done our story framing, we've understood the importance of text

46:22 as well, and now we're actually getting something, or at least that's what they're promoting, and I'll say that because I've been on the same jaded train as well, my friends. "Gated train." That's good, thank you. I'll be here for the next 10 minutes. But the last thing I'll say, because I know there's about 80% more to cover: no one has asked for it because it never worked. If you actually had something that could

46:53 provide the summary text for the main metrics, and for adoption, this would absolutely be one of the number-one requests. Give me something that works, give me something for adoption, so I can reach the people who aren't data savvy; I'm all on board. Let's be real: if any of these features worked exactly as we imagined, everyone would love every one of them. It's the difference between my expectation of how easy the thing is to use and what it actually does. You're trying to sift out some of the marketing jargon that goes along with this: "okay, this is amazing."
47:24 What I do feel here is that everything Copilot touches is a Gartner feature to some degree: "look at us, we can put natural language anywhere we want in our tool, and no other tool in the world has this Copilot embedded." And I would agree, if Microsoft is able to use Copilot in efficient ways. I want Copilot to document my model and write a description for every table. I want Copilot to document my measures and write descriptions into my measure syntax. So to me,

47:57 as a developer, that's where I see an immense amount of value in bringing Copilot into those experiences. Say you get a report from someone else, you go into the Advanced Editor for M, and you say "explain this code to me," and it walks through line by line: here's what we're doing, step by step, all the way through. Or you go into M and notice there are multiple remove-columns or rename-columns steps throughout the query: "hey Copilot, remove any unnecessary steps

48:29 from my M code." Or "hey Copilot, rewrite my DAX statement more efficiently." Those are the opportunities where Microsoft could really bring the power of Copilot right next to where we're having real problems and real issues. That's where I think the impact will come from, and if Microsoft gathers that, if they're able to integrate it at that level, I think we're really going to see some very happy customers. Mike, I hate that you're right.
49:00 I don't know if you tested the Power Apps Copilot features. "Wow, build me an app." Then I asked for something else, and wow, it built me the same app. I don't know yet. So the thing is: Microsoft, find the lowest-hanging fruit, the stuff that's redundant and tedious and that I don't want to spend my time on. I don't want to go in and optimize documentation; I don't want to be writing a whole bunch of comments all over the place. There are so many places where you can write descriptions: just do it for me, and let me review the output of what you've done. You have models with lots of tables, lots

49:32 of columns: "hey Copilot, hide all the columns that are not used in this report." How amazing would that be? I think we can get there, though. Yeah, and I'm slightly optimistic; I'm more with you than with Tommy as far as getting super excited about smart narratives. But at the same time, if smart narratives are designed to bring out key insights, keep it simple. Give me a smart narrative that tells me what I'm

50:02 looking at: where is this data coming from, what are the business-logic pieces, how is it getting filtered, in business-language speak. Because then I don't have to do all the things we talked about related to end-user training: understanding the scope and the specific areas of the report. You could easily show that to somebody if you understood the lineage of everything being plugged together, transforming that data and summarizing it in some way. So it's "this is what this is, here's the calculation you're looking at," and so on.
50:32 Then the end user could say, "oh, okay, that makes a ton of sense," or "why are they doing that?", and it would adapt and change, which means I don't have to update documentation. If you look at the power of LLMs across the board,

50:50 when we talk about AI and why people are excited about it, it's the bullet points, the context it can create, all of the fluff, all of the extra: "I need this in a job-description format," "describe this thing for me and lay it out so a layperson can understand what they're looking at." Yep. Generate the stuff that's just going to be time-

51:21 consuming for me to produce and time-consuming for me to maintain. Those are the big wins happening right now. It's the same thing in code, though: I know what I want to do, write the specific code for me. I'd also go down the route, like you're saying, Seth: I'm having visions of where this could go. I should be able to use it on top of a visual. I should be able to make a table visual, put the columns of data down so I can see the

51:51 columns being presented, and then say "from this table, make a bar chart where the legend is this, the x-axis is this, and the y-axis is that," and it should know to change it over and build everything I want. And why not use Copilot for styling your visuals: "make this visual a dark theme and center the text on all visuals," and it just knows where to do it. There are mass edits across a report that you could potentially implement with Copilot.
52:21 And I'm definitely taking the Copilot side on accessibility from a consumer point of view too: how to make the experience better for the consumer. But guys, there's still a ton of things. Any other major items you're seeing, Mike? Yeah, I'm really thrilled about all the Fabric things coming out; there are a lot of service-level features that are very important. I think OneLake integration with import-mode semantic models is going

52:52 to be interesting; I'm looking forward to seeing how that works. There are now shareable cloud connections for semantic models, which seems very interesting, and the semantic model scale-out I talked about earlier is going to be very helpful. So there are a lot of really cool things here. I'll also note one of the conversations I'm having: if you go to the Microsoft blog and look at everything that's there, you'll notice a "Microsoft Fabric explained for Synapse users" piece, and there are a couple of notes

53:23 there I want to talk about, maybe not the darker side, but another side of the world here. There's so much excitement around Fabric that I think a lot of customers are feeling, "hey, we've already spent time building Synapse things; what's going to happen with Synapse?" Because a lot of the same Synapse features are now directly inside Fabric. Based on the articles I'm seeing on the website, it sounds like there's a

53:53 little bit of nervousness among Synapse users who have already purchased this stuff: are we going to lose functionality, are you killing this product, is it going away? It sounds like no, it's not going away, but it also sounds like they're not going to do a lot of net-new development in it; you're not going to get a ton of new features inside Synapse. So as advice to people who have built things there: watch this one, because the days may be numbered for Synapse-related things. In my opinion there are

54:23 still some feature gaps between what Synapse does really well and what Fabric does. For example, there's a serverless SQL engine that comes along with Synapse, and the billing method by which you get data out of Synapse and into Power BI datasets is actually fairly cheap if you design your system well. I really like that part of Synapse: the ability to run a pipeline, to use a SQL endpoint to go get data, create views, and use them inside Power BI reports or models. It's great, so we'll have to see how this evolves inside the lens of Fabric.
54:56 Oh, the thing I didn't mention, which we should definitely talk about: Microsoft announced reserved capacity pricing for Fabric. It was announced by Arun in a tweet, and, surprise, it is essentially the same price as a Premium P1. F64 pricing is within $5 of the equivalent P1 pricing, so pay about five more dollars per month and you get all the Fabric things at the same price as a P1. Which logically, in my mind,

55:29 makes sense if they're trying to move you toward this CU capacity model; it just means you're going to use more compute for more things. Yeah, and now you're going to have a harder time managing it all. The other notable part, though, is how you manage the capacity: P1s are all managed and purchased through Microsoft 365, and typically negotiated. I didn't see a note about how you purchase the

55:59 reserved capacity for Fabric. Okay, so there's pay-as-you-go and then there's reserved capacity, and I don't know the details on how you purchase the reserved capacity. We're just talking F-SKUs here. Is it an incremental yearly purchase? Yeah, you buy it for a year and the reserved pricing comes down. NotAboutTheCell is saying it's coming from the Azure portal, so it won't be through the Office suite. Interesting. Worth it; okay, converting tomorrow. So this is an interesting

56:30 purchasing model. I didn't read the article on that; I just saw the tweet from Arun about the reserved prices, not the actual purchasing mechanics, so that's awesome to know. Thanks, Alex, for filling in the gap there. That's a huge win for a lot of us, because now we can very quickly evaluate whether or not we want to stick with Fabric, all in the Azure portal, which makes sense: a lot of these features are Azure-based behind the scenes anyway. So I think it's a good move.
57:01 probably the equivalent of a novel. There are so many engineering updates and new things across the Fabric ecosystem that we haven’t even touched on today, so there’s lots more to talk about and lots more to dig into. Mike will for sure post the link to the overall announcements in the chat; it just goes on and on and on, with tons of new capabilities that have been lit up ahead of GA and a lot more to come.
57:33 So let’s do some final thoughts here and wrap up with the main key points. Final thoughts on Ignite: Tommy, what do you think? I am incredibly pleased with it. A lot of users may feel that sense of overwhelm again, just like you felt at Build. Don’t worry; there’s a lot of information. Focus on the big things right now, and continue what you’re doing and learning. You don’t have to know everything right now. There are Power Query things, there are notebook things, there are new
58:04 Dataflow Gen2 things. Focus on continuing your adoption of Fabric until you get the hang of it. So Seth, what do you think for the final wrap-up? What are your main thoughts? I’m excited that it went GA as fast as it did. I think Copilot and all of the additional things that’ll help people build solutions across the platform are going to be very unique to this analytics tool, and Power BI being part of that is really exciting.
58:36 I’d love to see the continued enhancements around visualization, but overall I’m very, very happy with what I’m seeing so far and excited about it. I’m very optimistic about what Copilot is doing; I hope it’s going to actually provide true AI-based capabilities on top of things. That’s a good point: Dan reminded me here in the chat to make sure you go out and promote the Explicit Measures pod... let’s try that again; don’t write that one down. Use the hashtag #ExplicitMeasuresPodcast, go
59:08 promote it, and tell us why you like it on Twitter and LinkedIn. Tommy’s watching the hashtag; he’s going to count those up, and you’ll be entered for a free week of Tips+. If you’re at a conference and you want more Tips+, go find a sign somewhere at your conference, take a picture of yourself with that sign, and then hashtag it #ExplicitMeasuresPodcast. We’ll add a week-long subscription; we’ll reach out to you and get you a Tips+ membership. Also, if you want two weeks of Tips+ free, go find a PM somewhere
59:39 at Microsoft’s events. Find a PM from Microsoft, take a picture with them, and tag us with #ExplicitMeasuresPodcast. We’ll identify you and try to reach out and connect. This is all work that Tommy will be doing, so if it goes wrong or fails, it’s Tommy’s fault. With that, thank you very much; we appreciate it. Please share the podcast. This has been a lot of fun, and the chat has been amazing as always; we love the thoughts and comments coming in. I hope you’re enjoying the Fabric announcements, and particularly the Ignite event this year. Looking forward to more
60:10 announcements and more deep dives as they continue to announce things. With that, Tommy, where else can people find the podcast? Honestly, the announcements we’re making here are a little bigger than Ignite’s features... way bigger, way bigger. You can find us on Apple, Spotify, or wherever you get your podcasts; make sure to subscribe and leave a rating, which helps us out a ton. Do you have a question, idea, or topic that you want us to talk about in a future episode? Head over to the PowerBI.tips podcast page, leave your name and a great question, and join us live every Tuesday and Thursday
60:41 a.m. Central, and tell all your friends. Join the conversation on all the PowerBI.tips social media channels. I feel like saying that mediocre questions are also welcome; you could ask a great question, or one that’s not going to get on the podcast. Well, maybe we’ll do the worst questions we’ve ever had, written by Tommy; you never know. Anyway, thank you all very much, and we’ll see you next time.
Thank You
Thanks for listening to the Explicit Measures Podcast. If you have a topic you’d like us to cover, drop it in the suggestion link above, and we’ll add it to the queue.
