Exploring Direct Lake and OneLake – Ep. 221
Microsoft Fabric is moving fast, and two of the biggest concepts people keep hearing are OneLake and Direct Lake. In this episode, Mike, Tommy, and Seth work through what those terms actually mean, what problems they’re trying to solve, and why the details (shortcuts, security, and data formats) matter more than the marketing one-liners. They also share a quick update on themes.powerbi.tips—new features that make it much easier to build and maintain Power BI theme files without fighting JSON schema changes.
News & Announcements
- OneLake overview (Microsoft Fabric) — A solid starting point for the ‘OneDrive for data’ idea: one logical lake that spans workspaces, with shortcuts that can point to internal and external storage without copying everything around.
- PowerBI.tips Theme Generator (New Icons & Uploads) — They announce big upgrades to the theme generator: an icon library you can embed directly into theme JSON, plus the ability to upload an existing theme, validate it, and edit it in a built-in code editor.
Main Discussion
OneLake is the storage foundation Fabric is building on: a unified place where data assets live and can be discovered. Direct Lake is the performance-oriented way Power BI can query Delta/Parquet tables in that lakehouse-style storage without the traditional import vs DirectQuery tradeoffs.
- Treat OneLake as a logical layer: your day-to-day experience is navigating folders and items, while Microsoft manages the underlying ADLS Gen2 plumbing behind the scenes.
- Shortcuts are the key abstraction: they let you reference data across workspaces (and potentially across clouds) without duplicating files—great for reuse, but it raises important cost and governance questions.
- Expect security and organization to evolve: domains, tenant settings, and group-based controls are going to matter more as OneLake becomes the place everyone points to.
- Direct Lake isn’t ‘all files everywhere’—it’s tied to supported table formats (Delta/Parquet) and the lakehouse/warehouse patterns Fabric is optimizing for.
- If you already have Delta tables in ADLS Gen2, the interesting play is using shortcuts to bring existing tables into Fabric’s orbit instead of rewriting everything.
- Be realistic about performance and cost boundaries: even when you’re ‘just reading’, cross-region and cross-cloud access can introduce transfer/egress costs and latency that you’ll need to test.
- Don’t let naming confuse the architecture: OneLake, the data hub experiences, and Direct Lake are related, but they’re different layers (storage, discovery, and query mode).
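A way to internalize the shortcut bullets above: a shortcut is essentially metadata, a named pointer from a OneLake path to a storage target, with no data copied. The sketch below models that in plain Python; the record shape, field names, and URL are invented for illustration and are not the actual Fabric shortcut API:

```python
import json

# Illustrative only: a Fabric shortcut is a pointer, not a copy.
# The field names and URL here are assumptions for the sketch,
# not the real Fabric API.
def make_shortcut(name: str, target_url: str, target_subpath: str) -> dict:
    """Build a shortcut-like record pointing at external storage."""
    return {
        "name": name,
        "target": {
            "url": target_url,         # e.g. an ADLS Gen2 or S3 endpoint
            "subpath": target_subpath, # folder holding the Delta table
        },
    }

shortcut = make_shortcut(
    "sales_history",
    "https://mylake.dfs.core.windows.net/container",  # hypothetical account
    "/delta/sales",
)

print(json.dumps(shortcut, indent=2))
```

The point of the model: creating a second shortcut to the same target costs nothing in storage, which is exactly why the governance and cost questions in the bullets follow immediately.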
Looking Forward
If you’re adopting Fabric, pick one or two concrete experiments—create a lakehouse table, add a shortcut, and test Direct Lake behavior end-to-end—so you learn the constraints before you scale the pattern across teams.
Episode Transcript
0:30 good morning and welcome back to the explicit measures podcast with Tommy, Seth, and Mike hello everyone welcome good morning gentlemen good morning and I I happy Tuesday to you and everyone else listening here we go joining in back in another week again here we go it’s getting rocking and rolling slow down right I know it feels like I was up late last night getting stuff done yeah work was calling I heard the call and I got stuff done so
1:02 it was a late night for me last night so it feels like it already feels like it’s been two days man I I feel you I am completely on the same page as you looking at a few new tools obviously all the things with Fabric Fabric lots of things around Fabric a lot of things a lot of Fabric I just got into GitHub Copilot Labs or the Copilot chat in VS Code Insiders yeah which is amazing there something else came out I don’t know if this is a desktop or at least that just came out but there is now sync your phone or your
1:32 iPhone with your computer now that just came out for Windows which I found very fascinating and I’ve been playing with that so now of course because I need more notifications on my computer while I’m working all day I can now make a call from my phone I can now get notifications from my phone directly on my computer and in Windows 11. I thought it was really interesting so it’s like a Bluetooth connection to your your laptop or your computer and then it synchronizes notifications from your phone which I thought was really cool it’s a little limited like on the
2:04 Windows 11 because you can’t do group chats okay like one-to-one it’s not it’s nice I don’t know why we’re in 2023 and we have some Copilot and ChatGPT everywhere yet there can’t be a Windows application for iMessage when can when can ChatGPT come to iMessage that’s that’s what I really need like the family texts that are like hey did you see my new kid thing that they did I just want ChatGPT like all the paragraphs of things that just say something pretty and nice
2:34 just respond back nice there’s finally an official OpenAI ChatGPT mobile app oh a new mobile app all right really it’s ramping up I I listened to an interesting podcast the other day I’m gonna have to go pull the link for it because I don’t really have it was it the exclusive measures podcast well I I do I do regularly actually you’ll you’ll be shocked I never listen to the podcast after it occurs like it’s it’s done it’s
3:04 off I’m on to my next thing I’ll have to go find this other other YouTube video I was looking for but they were talking a ton about AI and I’m I’m very I’m very not not pro but I think I’m very positive to what it can do I think there’s going to be a lot of really interesting things that people are going to build I think there’s the ability to build junk and bad things but I’m very optimistic on people who build good helpful tools and I think it’s going to be part of your workflow it’s gonna have to be at some point I agree it’s gonna have to be
3:35 whether whether we have an influx of people that just start using it I think is dependent on whether or not the tool sets just integrate how well they integrate with existing folks and then elevate them as opposed to or just do everything for them maybe it maybe it’s chat maybe it’s chat based because like you can’t I suppose you can mess that up but whatever it is definitely going to be an accelerator but yeah what I’ll find interesting is if you think about
4:06 technology and the pace even while we’ve been in here yes right yeah and how a large populace is not even engaged in this technology revolution like this is going to happen so much faster that I fear I fear honestly what it’s gonna do to some folks it’s like and I think I think what I’m potentially feeling here is there’s going to be a larger separation between people who actually started to learn AI and started to
4:36 leverage it versus those who did not take the time to engage and learn prompting it’s already changing how I use it well yeah I I think I think I feel like this is the a lot of people are saying like the next 20 or 30 years is like another not revolution but like we had like the industrial era we may have the AI era coming up here shortly right there there is going to be the Google reprogrammed my brain for sure when you started using Google yeah you you were you were searching for things differently right
5:06 the information age was there all the information is available to you businesses and and the podcast I was listening to was talking about businesses have to reinvent themselves like every three to five years now and if you if you’re not like regularly tearing down your business and figuring out what is the core of what you do and how do you reinvent yourself particularly in the technology space I think it’s faster there than in other industries and my wife and I were talking about business we love working with businesses that have apps if your business doesn’t have an app and I can’t order things from you with an app I don’t want to do
5:36 business with you anymore like it’s becoming so it’s like if you don’t have a website you don’t exist yeah like it’s so true Tommy I’ll make a recommendation we should table this conversation and have it in a longer format because okay we have a lot of things we have a lot of things covered today that’s true we’re not making puns today no punting no no punting we are pretty punny we are but you know anyway what else what else we got going for right I got a major announcement this is
6:07 what I’m just super excited about the PowerBI.tips theme generator so themes.powerbi.tips just came out with two amazing huge updates over the weekend Monday we did we just finished up releasing things so I just want to really point this out if you’re not using the theme generator today go check it out if you are using the theme generator there are some new features you want to check out we just released 8,000 icons that you can add to your theme file so in theme files people don’t know this
6:37 typically but it’s more of a nuanced thing you could go get an SVG that you build you could add it to your theme file and that SVG icon could be accessed inside your Power BI report helping you build just indicators icons inside your your reporting fully customized icons right so yes apply your theme colors to all of the different icon shapes so what we we decided this is too hard again too much code so we built a library of 8,000 icons we have released
7:09 them you can change the color of the icons you can change the name of the icons and it automatically drops them right into your theme file so that’s a huge win right there by itself and the most number one requested feature was the ability to upload your own theme file so that is now available inside the theme generator you can go upload your own theme file it will code check it for you it will let you minify it it’ll let you expand it prettify it or beautify the code so lean into that right oh yeah so
7:39 cool upload but the experience is such that your upload pops up in a code editor yeah and will tell you where the errors are line and row so if you if you want to use a code editor alongside the theme generator that is now fully baked into the tool and it’s amazing yeah and at any point in time wherever you’re at in the tool you can actually click the code editor and you can go to the add your colors you can click the code editor and you can see the array of code colors
8:10 that are there and you can edit that code and hit OK it will then propagate the changes back into the tool so if you want to delete some colors if you don’t like a color if you want to move stuff around or rearrange things you can and you can see what code is generated so those are for Tips Plus subscribers so if you are a Tips Plus subscriber you get all the new features you get a sample of icons if you’re not a Tips Plus subscriber but we want you to get it we are doing a lot of work here we’re doing more dev work we’d love you to support our team as we continue to build
8:40 more features into this tool to make this one the world’s best theme generator so we’d love your support if you can it’s it’s fairly cheap it’s two bucks a month if you buy yearly it’s three bucks a month if you buy monthly so I try to make it easy and economical for you to jump in and start using themes anyways we hope you find value from it more to come on the YouTube later I guess Tommy I love the upload I love the upload so much I think that
9:10 I’ve been because I was adding all these issues or or feature requests to the GitHub repo and one I’d love solved the duplicate would be nice but this will be great this is great too this yeah so it’ll be at least yeah we don’t have a duplicate feature yet so you just can’t copy a a theme yet but this is probably your easiest download it real quick upload it again you can duplicate it that way for now yeah and again the biggest thing is I I what I was realizing because I’m using the theme generator it’s funny and when
9:41 Microsoft did the change to the new schema it made basically trying to edit the JSON file which is what I normally did almost impossible yes and then you updated the theme generator right away and I realized like I have these Global properties that I want to do for everything for background for for tables and I’m like I have to recreate this every single time because I wanted to change the colors out yep not anymore phenomenal phenomenal anyways
10:11 we’re very excited about this new feature we think this is going to be great go check it out I’ll drop the the URL in the in the link here below so just go check out themes.powerbi.tips if you want to go look at the tool there’s a lot of free features that are already there for you and then if you want to get a couple enhancements upload your own theme files code editing all that other fun stuff you’ll be able to check that out at the theme generator so we really appreciate you awesome one other
10:41 thing I found really interesting I just want to call this out here from across the internet I browse LinkedIn all the time I know
10:47 you I don’t know if you guys do the LinkedIn checking thing like I’m always on LinkedIn seeing what articles are posting yeah and every so often I get a comment or something and I found this gentleman’s tagline to be incredibly I don’t know I just thought it was very I thought it was very well written and maybe this is stolen from somewhere I don’t even know but Aaron you did a great job so Aaron Bernello a gentleman out of the greater Chicago area here based on his LinkedIn profile so near Tommy the homeland of Tommy
11:17 he says data without actionable insights is just overhead and I was just curious your thoughts on this do you guys agree data without actionable insights is just overhead I thought man that’s I thought that was pretty profound if you’re if I like it if you’re not building stuff to take action well then what are you building but that yes it’s exactly it’s our argument of of why
11:48 business intelligence and our profession is is like valuable a thing yeah right I just thought it was very consolidated like I had to think I’d read it like a couple times I was like yeah that’s right right and I was trying to like overlay that thought against like how much content is being made in your organization that is just how much waste is being produced that is just overhead and not just that the the idea that everyone wants to see their data but it’s volume it’s these aggregate numbers
12:19 that what can this process tell you and yes I’m realizing more and more we’ve said it a bunch of times on this podcast but like our skill in terms of the personal the individual who can really effect change obviously anyone can build a bar chart it’s easier and easier nowadays too and yeah technical skills with modeling but I think this goes back to something I’ve been preaching about our role in an organization with BI is not just to here’s a number that you
12:49 want total sales but to actually say like again think about the threshold thinking about the pressure points what are you really trying to see you may not need to see all the time the exact number there’s financial reports for that for the actual in a sense organization’s total numbers but reporting shouldn’t just be that over and over again so more and more on like where if this number would continue to grow where is the fire is that great is that bad and I
13:20 think us taking a step back and also users like hey we have reports you can see your general performance over the last year but what we’re going to focus on is what are we really trying to measure what are we trying to really reduce uncertainty on I would agree with that and I think that this also I think speaks to a lot around what are your goals what do you need to happen to make things a success this really speaks to Seth what you’ve been talking a lot about or we’ve talked a lot about in the podcast prior all the
13:51 OKRs do you have a key performance indicator and is there a time-bound component to it where does that live excellent excellent so super thrilled about that before we get too far down this rabbit hole and do intros for the full episodes lots of rabbit holes a full a full intro the May desktop the May Power BI Desktop did come out we are now in June but we’re now getting to talking about what the May release just had so bear with us
14:21 while we catch up here a bit so anything that stuck out to you gentlemen around things that came out in the May release that you thought oh that’s interesting let’s talk about that yeah yeah a lot of good accessibility updates so so nothing that’s a brand new feature thing but it’s good to see and I think they cycle through this once in a while where they add a lot of those types of things in yeah yeah which is always super helpful for
14:51 developing reporting right we have to be always cognizant of many different audiences yes and ensuring that everyone has the opportunity to digest the information that we’re trying to convey at all times so so a lot of good stuff to see there I’m very happy to know that they’re updating the on-object visual editing experience I know at some point they’re going to kill the old editing experience and it’s just gonna go live I’m happy that they’re taking
15:21 feedback because it was rough when it came out initially it just needed some more love so I’m definitely very pleased to see that they’re listening to feedback things are getting a little bit better the editing experience like windows are staying on longer one of the features I can’t stand around this new on-object editing is the suggest a type the suggest a visual type I get they’re trying to help you out and for new users I get it makes sense I’m not a
15:51 new user I’m an experienced user I know what chart or table I want to put on stuff and every time I would try to switch something it keeps suggesting stuff for me I’m like I don’t want that I want it off so there’s now a setting inside desktop that turns suggest a type off by default which I think is a very good feature so that way it doesn’t keep recommending random things that I don’t want they gotta make that window larger because it literally is a slightly more advanced version than the personalized
16:21 visual in the service that was meant for consumers who have no idea what they’re doing that’s what on-object is and it’s so frustrating because yeah I need a bigger window for all the settings if I have custom visuals in there too I’m glad that auto suggest is off but even that window still when I’m trying to choose or update like just trying to add tooltips or trying to add the objects you have there’s no there’s no way you’re not going to scroll just even updating a field or
16:51 property and then it disappears you have to add it again there’s a lot more clicks so I think they’re listening to feedback which is great so we’ll continue to see that I’m personally excited for two things okay the Azure Maps is generally available which has been really my default for anything that I want to do from visualization with maps Azure Maps used to only work if you had a latitude and longitude but now it does all the types for zip country
17:23 etc. it’s a great visual and I think it’s much better than the the standard one but I think probably the coolest underrated one is the measure-driven data labels which is just one of those little nice touches that really can enhance things so a bit of brief background if you have a value in let’s say a bar or a line chart the data label is always going to be obviously whatever shows up on the visual yep whatever that measure is but now we actually can create or in a
17:54 sense use a custom one or do our own so we actually say hey the field should actually be let’s say percentage difference so something custom created obviously we use the context of that visual and where it’s at but this is really really one of those things that goes also to the idea of just data without insights right where those numbers if you’re looking at email sends or if you’re looking at like the number of leads over each
18:24 month 26,000 versus 25,000 well is that great that doesn’t really help too much unless you have a target so there’s a lot I I’ve seen one blog article on this but I think this is something that people really haven’t discovered yet it’s a good call out anything anything else Seth that stood out to you and that you felt like was a win no and I missed the really good one so I must be tired
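The "26,000 versus 25,000" example is exactly what a measure-driven label fixes: the label can carry context instead of the bare number. In a report this would be a DAX measure; here is the same formatting logic sketched in plain Python, with made-up numbers:

```python
# Sketch of the "custom data label" idea: instead of showing the raw
# measure (leads per month), format a label that carries the comparison
# a viewer actually needs. Numbers are invented for the demo.
def label_vs_target(value: float, target: float) -> str:
    pct = (value - target) / target * 100
    arrow = "▲" if pct >= 0 else "▼"
    return f"{value:,.0f} ({arrow}{abs(pct):.1f}% vs target)"

print(label_vs_target(26_000, 25_000))  # "26,000 (▲4.0% vs target)"
```

The label now answers "is that great?" on its own, which is the podcast's whole point about data needing a target to be an insight.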
18:56 well a lot of it you will notice there’s a lot of stuff coming from Fabric so there’s a whole bunch of there’s now Power BI Direct Lake this is one of the topics we’re talking about today that’s now in preview you’re getting SharePoint and OneDrive integration as preview so there’s a lot of preview features that are coming that have been announced at Build that are now starting to show up inside desktop so be aware there’s a lot of preview things that are turning on you can go try those things out now yeah in in all I feel like oh another one I thought was interesting here was setting query limits with Power BI
19:27 Desktop do you guys see that one your desktop is very powerful and it has no holds barred on how many cores or how many queries it can run and you and I’ve seen this a lot of times you publish reports to the service you hit it a couple times and there’s a query or a table or something in there that says whoops you ran out of memory pause you can’t run any more data and you’re like what the heck what’s this error for it worked fine on my desktop now it doesn’t work oh Donald yeah you pick up a good one
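The failure mode described here (works fine in Desktop, out of memory in the service) is what query limits guard against: making Desktop refuse the query the way the service eventually would. A toy illustration of that kind of guard; the limit value is arbitrary for the demo:

```python
# Toy illustration of capping queries at authoring time: enforce a row
# limit before materializing results, rather than discovering the cap
# only after publishing. The limit here is arbitrary.
def run_query(rows_estimated: int, row_limit: int = 1_000_000) -> str:
    if rows_estimated > row_limit:
        raise MemoryError(
            f"query would return {rows_estimated:,} rows; limit is {row_limit:,}"
        )
    return f"ok: {rows_estimated:,} rows"

print(run_query(50_000))      # small query passes
try:
    run_query(5_000_000)      # blows past the cap, like in the service
except MemoryError as e:
    print("refused:", e)
```

Failing fast on the desktop is the whole value: you find the expensive query before your report consumers do.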
19:57 Donald in chat picks up git integration isn’t horrible a good feature that’s coming so there’s a Fabric preview feature we can go look at some git integration that’s coming out through the Azure portal sorry the Power BI portal using Fabric at this point yeah I’m really excited to see where that one goes cool cool any other topics you want to cover for intro pieces I think that covers it all right let’s move on to our main topic of the day so diving in a bit more to
20:29 Fabric I’ve been seeing a lot of comments threads people chatting about things directly around what is this OneLake thing what is Direct Lake what the heck are we talking about here so maybe we should just go through and briefly do a little overview of let’s talk about a part of Fabric that is now this thing called OneLake and where does that fit with Power BI now and how does this OneLake ecosystem work with another feature which is a feature not necessarily a product or a
20:59 part of a product which is now Direct Lake so talking on those two items or should we start with this one Tommy we’re going to start with just probably defining what is OneLake first and then we maybe talk about Direct Lake second so OneLake is and I’ll I’ll do the best job I can with this but it’s really in Microsoft Fabric going to be according to Microsoft the OneDrive for data which I’m sure you love but
21:30 for data which I’m sure you love but they’re pushing that idea where really all properties elements what we do from
21:36 our analytics and our source data is in one area so really it’s rather than if we had to do multiple copies if we had to do a lot of transformations we can actually have it in OneLake one source even with transformations even despite if it goes to other areas as well so I know you’re yeah and you were picking on me Tommy yeah yeah yeah you’re picking on me about okay this is this is OneDrive for a lake of information and I think what you were
22:06 picking on was I really dislike the analogy of well Power BI is just PowerPoint yeah okay so I know where you were picking me here so it is like it’s driving into this everything we’re doing has to be related to an Office product now I know I know so so what I will say is I don’t really love PowerPoint I feel like it’s bloated I click I feel like I do so much clicking and not a lot of building
22:37 so that’s one thing I have a strong distaste for there I will say the OneDrive product I really do like and so I think if you’re aligning the OneLake experience to OneDrive for your data I do like that analogy and I think that does make sense the idea that you can have this centralized lake thing that lets you collect all your information and it easily integrates with your desktop it synchronizes files between your machine
23:08 and the cloud and you can edit them or upload them as needed I think this is real I would imagine again I don’t know the technology stack behind the code that Microsoft writes but I would imagine that OneDrive is technically just using blob storage behind the scenes and it’s just an interesting integrator between a blob storage account and your computer so to me it seems like it very much makes sense and they’re evolving the OneDrive product to what I think it really should be is
23:38 think about your your entire company like the amount of times I’ve had a computer die on me and I just go not worried about it everything’s already in OneDrive and then I just reload my computer log back into OneDrive sync my folder and I have all the links to the information that I need this is very helpful so I really like the idea of not keeping information on your machine it should be going somewhere in the cloud that is best practice I would think at this point so Tommy you ripped me on that one but I would agree I think I think this is
24:08 this is like the analogy but I do like OneLake being a better or more evolved product than OneDrive and I think I would for sure move from OneDrive onto OneLake if that meant I could still do all the same things I could normally do in OneDrive as well I think the concept is interesting and it even further challenges a lot of the organizational structures that exist in companies today right where you have
24:40 you have administrators of these resources that get spooled up and even even for companies that live in Azure and we’ll just say Azure is the best and only cloud platform out there there are Azure administrators right yep who have been managing resources right and these are data lakes where people can already put tons of different files and OneLake challenges that and I even I found it interesting that even in
25:10 the technical documentation it was well you don’t have to worry about that anymore nobody can block you guys from having a lake because the minute you turn this on there’s OneLake and that OneLake is in the tenant right that’s the Azure tenant at the highest level your org gets one OneLake what I would have some interesting questions about is what if I have multiple tenants within my own company I still have to explore what one
25:41 OneLake yeah but I think that to me just on a base level right it is this one entity one service that is going to house all of the objects and artifacts that you create within the Fabric ecosystem so yeah and I think of impacting yeah like or or changing how things work within a company I I think is already it’s breaking minds right like how do we
26:11 figure out how do we do guard rails or costs and all this stuff which I I expounded upon a while ago when Alex was joining us but it’s like there’s all these considerations that have to be accounted for and walked through and talked about but it challenges the way in which you’re bringing all your data together which is I think fantastic I think it opens up a whole slew of like really interesting opportunities that
26:42 we’ve talked about as challenges before and hopefully it solves some of those because the only way to do this and how it’s typically implemented in organizations now is you have different lakes for different groups right and there’s always challenges about like okay well now I have to create a new ADF pipeline and I have to move things over here and we’re actually moving data from one place to another this changes a little bit of that mindset another thing that they announced during Build which I
27:12 thought was very fascinating was this OneLake thing so if we think about OneLake and we think about GDPR and where data has to live and when you start working internationally this data situation becomes a bit more challenging because the data must live in the country that it was generated in so if you have countries over in Europe they have stronger GDPR requirements around if you collect data from people in that area that must stay in that area and so the OneLake element here it sounds to
27:43 me like they’re also doing a little bit behind the scenes is there’s like a thin wrapper of here’s a bunch of access points or endpoints you can go get data from by the way we’ll keep the data in the region that most makes sense where that data needs to be accessed from and you just access the OneLake endpoint but they keep track and put the data into the right region or location so there also seems to be something here where I think of OneLake in my mind if I’m trying to like my mind’s eye about what this thing’s doing it sounds to me like OneLake is a collection of
28:13 storage accounts or Azure Data Lake Gen 2 accounts behind the scenes that Microsoft is managing for you and all you’re doing is saying oh I’m just going to access this file in this folder and Microsoft itself figures out okay well we know that’s just a shortcut to where that file lives here it is in any one of these regions we don’t really care where it comes from but that makes sense that totally makes sense right and it absolutely does because Raphael makes a really good point here because it transforms the existing implementations that we have
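The "thin wrapper over regional storage accounts" mental model speculated here can be sketched in a few lines. Everything below (the region map, the endpoints, the path format) is invented for illustration; it is not how Fabric is actually implemented internally:

```python
# Toy model of OneLake-style path resolution: one logical path, with the
# platform deciding which regional storage account serves it. The region
# map and endpoint format are invented for this sketch.
REGION_MAP = {
    "sales_eu": "https://storage-westeurope.example.net",  # EU data stays in EU
    "sales_us": "https://storage-eastus.example.net",
}

def resolve(logical_path: str) -> str:
    """Map a logical OneLake-style path to a concrete regional endpoint."""
    item = logical_path.strip("/").split("/")[0]
    endpoint = REGION_MAP[item]
    return f"{endpoint}/{logical_path.strip('/')}"

print(resolve("/sales_eu/delta/orders"))
# The caller never sees the region choice; it just reads one path.
```

That indirection is the point of the GDPR discussion: the consumer addresses one logical lake while residency is handled at the resolution layer.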
28:44 It absolutely does, and Raphael makes a really good point here, because it transforms the existing implementations that we have from platform as a service, where you're instantiating and building these services, controlling permissions, etc., to full SaaS, yes, software as a service, right? Okay, we're taking care of all these things for you. I think the GDPR thing will be interesting as people dive into whether, specifically in OneLake, this is placed here or whether that's workspace-driven. And then there's this concept of shortcuts, right? Yeah, where I think OneLake
29:16 having a centralized location for all your data assets, as well as the ability to connect to even third-party data sources and not pull the data in, through this shortcut, is also where this would play in, even internally, where I can create a shortcut between workspaces. Yes, right, and maybe that speaks to, okay, well, this workspace is
29:47 delineated to this region, and that's where the actual data is stored. Now, let's ignore the fact that if you're viewing data that lives in Europe on your computer, bits don't just show up on your screen without going through channels; but hey, let's just say the data still lives in Europe all by itself, and that's where it is. So I think this is going to
30:18 bring up a lot of really good questions. At the core of what I'm seeing here, to your point, Seth, around accessing data from other clouds, I'm very interested to see what the data transfer cost would look like. And I think this is also going to require you to think smarter about how you design things, because, okay, I've got data in AWS S3, right? That's one of the out-of-the-box supported features: you can go to AWS S3, connect to a bucket, and then access data or files that are coming from there. Very interesting,
30:49 which also means now you can build your own Delta Lake tables over there and then read them right into Power BI via Direct Lake. This is very interesting, because now you don't have to physically move the data, which I think for a lot of companies will be an advantage, but there's going to be a cost associated with that. Well, I think there are some really big points there, too. No, I was just going to say, but there is no data transfer cost. Well, you're always probably
31:19 going to have compute costs. Well, there is a data transfer cost, because anytime you access it there's egress and ingress for everything on the AWS side. That's the beauty of the shortcut, though. But interesting, you're saying AWS would charge me? Well, yeah, of course; you're actually moving data. I'm just reading it. Correct, so I don't know if there's going to be a cost on the Microsoft side. You're right, there's a compute cost to go read that shortcut that's somewhere else, right? But every time you access
31:50 something like a blob storage account in Microsoft, anything that accesses that account, even from a different region, gets a transfer egress or ingress cost from Microsoft. So maybe not on the compute side, but AWS will be like, hey, look, you just accessed two terabytes of data, we're going to charge you for the egress of that read. And again, it should be cheap, right? Reads should be very cheap if you design your system around reading efficiently, but there will be some cost associated
32:21 with that.
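The egress point is easy to put rough numbers on. A back-of-the-envelope sketch; note the per-GB rate below is a placeholder for illustration, not a quoted AWS or Azure price:

```python
def egress_cost_usd(gb_read: float, per_gb_rate: float = 0.09) -> float:
    """Rough cross-cloud read cost: bytes leaving the source cloud are billed as egress.
    The default rate is a made-up placeholder, not a real published price."""
    return round(gb_read * per_gb_rate, 2)

# Reading the 2 TB (2048 GB) from the episode's example through a shortcut:
print(egress_cost_usd(2048))  # 184.32 at the placeholder $0.09/GB rate
```

Even at a low per-GB rate, repeatedly scanning large tables through a cross-cloud shortcut adds up, which is the "think smarter about how you design things" point above.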
32:22 So, yeah, Joe's confirming AWS will charge you when it leaves. Sure. Sorry, go ahead, Tommy. No, I think there are two really big points here, and you're going to see a big change to the UI, because of shortcuts and also security; that's one of the huge parts of OneLake, utilizing domains. They're really updating the tenant settings, utilizing security groups, but that domain feature with OneLake is going to be
32:53 huge from an organization point of view. I think the story Microsoft's telling is, obviously we'll be using the workspaces, but that OneLake data hub seems more and more to be not necessarily just for the consumers, because now, obviously, you're going to get the lakehouse in there, and there are the datasets; it's not just a hub of all your reports. I think initially the dataset hub, or the data hub, really was meant
33:24 to be for consumers to use, just to see what data and what datasets are available, but they're really changing this to be, like I said, almost like our central hub when it comes to the engineering-side focus on Fabric. Another thing, yeah, let's talk about your point first. You keep going point to point; let's pull it back here a bit. Let's talk about your first point there, and again, you're excited about Fabric and OneLake and all the different articles. So I
33:55 think what you're pointing at here is that there's going to be an added confusion piece as we think about what OneLake is actually doing, because you're right, this OneLake terminology is now popping up all over the place. Direct Lake is part of the lake, but it's a feature, it's a connection type; it's not necessarily OneLake. And then there's the OneLake data hub, or whatever the heck they're calling that. What was the name of it again?
34:25 The OneLake data hub, okay. So the OneLake data hub is, okay, here's a bunch of assets. To Microsoft's point, there are going to be a ton of things created in many different workspaces, and the idea is they want to make all of this available to you. So it's like, I made a Delta table in this workspace, great; well, how do I discover that stuff? How do I know to reuse it, right? What's the point of having a really nice product master table if no one can access it, or no
34:56 one knows it's there? It's the old 'if a tree falls in the forest and no one hears it' question: did the tree really fall? Did anyone observe what actually happened? It's the same thing with your data. So while the OneLake data hub is a feature of powerbi.com, it's not necessarily just OneLake; it's actually incorporating SQL data warehouses and datasets and all these other data assets rolled into one thing. So it's really muddying the waters around where does one product, like OneLake, end
35:27 and where does the next one, Analysis Services or the SQL data warehouse, begin? Everything is starting to feel much more fluid together. Is that what you're saying, Tommy? A bit. I think that's a huge part here, and this is the awesome part of Fabric: as intimidating as it can be for a lot of people, it is such an amazing story of, rather than having to have 14 different websites or
35:57 portals open. I think you made a great point the first time we talked about Fabric: the users don't have to worry about the VNets and the resource groups and all that additional configuration. If we just need to add raw data, there's the OneLake File Explorer on Windows, which is already available, right in File Explorer. It's not like, oh, I need to download Storage Explorer from Azure and then push
36:28 things in there. Before, I needed a Data Factory or a pipeline just for getting the raw data in initially. Yes, and now we can basically sync that OneLake Windows Explorer just like OneDrive. If people are adding raw files, just that first step is huge. Yeah, I think that's a huge part here. One other item
37:00 that hopefully is not getting missed is that, with OneLake and all this, you don't have to do each of those engineering things in Fabric. There are obviously a lot of use cases, but it's not like, oh, in order for me to actively use OneLake, I have to use Synapse and Data Factory and everything else. There are a lot of use cases where you're only going to take one or two of those items. Yeah, and I think what's interesting to me, and the chat's getting hot around it too, is, Mike, you made a really good point there, where if
37:32 we're in a shared workspace, right, and our workspace is traversing aspects of Fabric, and objects are created and available to us for visibility, which is OneLake, this is obviously a good thing, because now we all have the same set of objects we're dealing with. Yes, and across teams those objects can be created within parts of this ecosystem within Fabric, right, or modified, or updated, etc. And that's where I think
38:02 this delineation matters: understanding that objects in OneLake are accessible, but they are different, and each of these parts of Fabric can create different types of objects, right? So if a user wants to upload a CSV file, they can; to my understanding there's no restriction around what type of data can be stored in OneLake or made available to us from a connection perspective. However, when we start
38:34 talking about certain things like Direct Lake, which is the connector from Power BI that is going to provide a stateful view of whatever the data in the object is, now we're talking about a specific type of storage within OneLake. That is really the accelerator for things like the lakehouse and the data warehouse, which store this information in Delta Parquet format. Now, I think in the documentation it says,
39:04 hey, if you want to use Direct Lake, you use the lakehouse and that interaction point, so that you're creating a Delta Parquet table. Does that mean you're not going to be able to access, through shortcuts, other Delta Parquet files or tables you may already have in Databricks or other systems? No, you absolutely should be able to do that. Yes. What I still need to test out is the performance of
39:34 that type of stuff. Yes. But I think where that shifts, and people are still navigating through this, is, okay, where do I create what? If I need to do this thing, is that something I need to build in OneLake? Does that mean I can do it in the SQL warehouse, or do I have to create a Direct Lake, a lakehouse, all these words? Okay. Ultimately, I think that's where a lot of
40:06 the nuance is going to be: yes, it supports all these things, but how we interact with them and what we create is going to be different based on the experiences that Fabric provides, because they're still the same systems underneath, albeit a better, more unified, easier SaaS experience. If you understand the concepts behind those, it makes a lot of sense; to me, because we've been using Delta Parquet for a long time, all these things apply, and it's like, oh yeah, okay, that makes sense, I totally want to do
40:37 that, that's how I would interact with it. But for the lay user, there is nuance in here, and I think some challenges as far as understanding what do I use, what is this thing, etc. So there's a question going on in the chat here that I want to address, just to be clear, because I think many other people are going to struggle with this as well. In my mind, if I step really far back and look at the broad spectrum of what Microsoft has announced here: what was the major shift that's driving
41:07 this? What is changing here? Everything. Where does Power Query Gen 2 fit? What does that look like? Now we have these things called pipelines; what are they doing? Okay, now we have more access to this thing called the Spark engine; what's happening there, what does that mean? If I step way, way back, OneLake is just the unification of a bunch of storage accounts; that's how I see the OneLake element. But all these other tools, Power Query Gen 2,
41:37 pipelines, everything, are now bringing a lot more data directly into the lakehouse, or OneLake, as Delta tables. The major shift I see here is that Microsoft has basically said Power Query will now use Delta tables, and Power BI datasets, the VertiPaq engine, will now have a default storage system of Delta tables, because it's a columnar store. So to me, that single shift between those two
42:07 products, now sitting on this open-source Delta table technology, has changed the entire ecosystem, because now SQL serverless can read Delta tables, Power Query can read and write Delta tables, and Power BI now reads and writes Delta tables. So to me this is a major competitive play between what Microsoft is doing and what Snowflake or any other cloud-based storage system is doing. This is the format that Microsoft has
42:39 chosen to move forward with for all of their data ingestion tools. So now, the questions around, well, what about Databricks? It can just read it, because it's the same format. What about all these other tools? It all just works now. So I think the major fundamental shift is that Microsoft has really embraced this Delta table format, incorporated it into all their products, and now we can have all this stuff. So yeah, Donald, you asked: can Databricks generate the ordering, or is that Microsoft-only? My understanding is there are two sort
43:10 orders that happen in these Delta tables. There's a Z-Order that comes out of Databricks; the V-Order is a Microsoft-generated sorting that matches the VertiPaq engine. So I think VertiPaq can read Delta tables coming out of Databricks, because it's the same technology. The V-Order sorting, though, I think is proprietary to Microsoft for now; I don't really know if that's something that's going to be opened up so you can use it in Databricks. At this point, the feedback I got was that you can optimize
43:40 your Databricks tables by partitioning them in a specific way, but there is a proprietary component to it, right? So you can get better performance out of Delta tables that live in Databricks or ADLS Gen 2, even though they happen to be created by Databricks, but yeah, there is a proprietary component to it.
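The shared intuition behind both Z-Order and V-Order, laying data out so similar values sit together, can be demonstrated with nothing but the standard library: a sorted column compresses far better because it forms long runs. This is a sketch of the general columnar-layout idea, not of either proprietary algorithm.

```python
import random
import zlib

random.seed(42)
# A low-cardinality column, like a dimension attribute in a fact table.
values = [random.choice(["red", "green", "blue"]) for _ in range(10_000)]

shuffled = "".join(values).encode()          # values in arrival order
ordered = "".join(sorted(values)).encode()   # values clustered into long runs

# The ordered layout compresses dramatically better than the shuffled one.
print(len(zlib.compress(shuffled)), len(zlib.compress(ordered)))
```

Columnar engines like VertiPaq get the same win from run-length and dictionary encoding, which is why the write-time sort order of a Delta table affects read performance.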
44:10 There's one comment I want to respond to, from Jay Murphy. He asks, in the future sense, how will this make sense to the business? Yeah, a lot of what you were talking about, Mike, is part of that, because I think this is a multifaceted answer, and you guys probably have a lot of different ideas as well, but you're removing barriers that require certain applications to be how you interact with data, right? So one, across teams, this just accelerates how fast data can be shaped and formed to be
44:41 available and accessible to a business user. And in the other direction, business users utilizing Power BI within the Fabric ecosystem should be generating data objects within OneLake that would be accessible to the professional tier, right? So where my mind goes with that, and I still need to flesh it out, is, does that speak to this story where you can have business users and business
45:13 units developing reporting, and then you understand what data objects they're using and potentially migrate them into an enterprise-level asset if that report hits a certain scale, right? We talk a lot about where you would start to promote a report throughout an organization: okay, I have a business unit, now all of a sudden it's multiple business units, a whole bunch of people are looking at it, or executive leadership got really interested in this one and now wants it as part of their executive dashboard. Yes, rather
45:44 than starting from square one, randomly finding where all these data files are, etc., if all that's in OneLake, well, now it's much more easily discoverable, and maybe swapping those out would be easier. I agree, but at the same time I think the speed with which data can be made available to the people generating reports is one of the biggest wins throughout this ecosystem, right? Because you're taking a lot
46:14 of the challenges of setting up infrastructure and removing them as much as possible, yep, and then having them become more literate in this environment as well, right? You're giving them much bigger tools to interact with, and in some cases that is still a blocker within orgs, like, it takes forever to reprocess something. You're pushing automation into everything that people do and allowing, I think, a lot more opportunity for people to skill up in orgs, too.
46:46 Do you see any other aspects of the business-user side of things in this? I felt, from that question, as you were talking there, Seth, like this was another scenario where Microsoft took something like Power Query or stored procedures, right? A stored procedure was this thing where I'm taking data from here, I'm going to do something to it, and I'm going to put it over there; that's the concept of what a stored procedure is to me. This feels a lot like, okay, we're just going to take Power Query and update a little bit of the
47:16 under-the-hood technology piece of it, and give you more point-and-click with Delta tables, right? So you don't have to know that it's Parquet, you don't have to know that it's V-Ordering or sorting the tables; you can just go into Power Query, load data from a server, and say, drop it here in this lakehouse. Under the hood there are a lot of technology things happening, but you're now able to literally walk through a UI, which shows Microsoft is really dang good
47:46 at simplifying these complex things, to your point earlier, Tommy, right? I don't need to get Synapse turned on, I don't need Key Vault, I don't need Azure Data Factory, I don't need to get a lake started; all these things are now one button press. Start a workspace in Fabric, boom, done, everything you need is there: all the engines, all the compute, everything bundled in one nice package. So one button click literally changes it from a Pro user to a Fabric user, I think that's
48:17 what it is, a Fabric user, or a Fabric workspace, I don't even know what it's called right now. But that one click of a button lights up all this new technology and gives you the ability to see all these things. I think this is going to be really impactful. My counterargument to all this is that, for the business user, you've now been opened up into a world where you're going to be able to see a whole lot more data, and you're going to be able to share whole tables a whole lot more easily. So I think this is, in general, going to
48:49 create a much larger data governance issue. Oh yeah, oh yeah, like segmentation. Yeah, well, and then if workspaces are really how you're delineating dev, test, and prod, or things like that, and it's all still stored in OneLake, correct, yeah, if you thought we had challenges around segmenting out different datasets, or certified datasets, or gold datasets, yeah, your schemas and how you name things are
49:20 accessible. What is it going to look like for an admin who has access to see across everything? Oh, totally. And even just the traditional mindset, and this is what I've said to Microsoft a number of times now: the traditional mindset of dev, test, and prod separated by physical hardware, separated by physical storage accounts, is no longer a thing. OneLake takes all of this and says, no, no, no, we're not going to let you spin up three separate sets of hardware; instead, you're going to land everything in one place. And this is more of a platform-as-a-service
49:51 space, as opposed to infrastructure as a service. When you start moving to platform as a service, you stop worrying about spinning up multiple copies of stuff and you start thinking about how to take one infrastructure and segment it into your different environments. That's a very big mindset shift for IT organizations, and they're going to push hard against it, because, well, they do understand; they just don't want to do it. I think
50:22 it's more of a political game at this point. But that was my earlier point, right, and Tommy keeps putting his hand up: it doesn't matter if they don't want it to, right? OneLake just happens; the instance gets put in place, and what are you going to do? You can't shut it down. You're using Fabric; that's what you're using. Yes, go ahead, Tommy. Oh man, jeez, you guys. We're talking about
50:54 business users right now, and I really hope there are some streamlined UI features that we can customize if business users ever get involved. Generally speaking, and again, I'm happy to be wrong and happy to be surprised, but I would say that general users in sales, marketing, and operations are probably not heavy Python and Jupyter users, so that's,
51:26 in a sense, one thing right off the bat that you really don't want them in. There are so many options in Fabric, just in OneLake, that whatever we set up for them, we want it to be as streamlined and simple as possible: here's the dataflow, you already know what a dataflow is, now you can push it to OneLake, great; don't worry about pipelines, don't worry about Jupyter notebooks, awesome. I think the other problem we're going to see, too, is if some of our
51:56 content is in OneLake and some of our content is still in our normal ETL systems, where does that leave us? Because Direct Lake only works, obviously, with Fabric, if all your data is already stored in OneLake, correct? No, no, it works primarily, and best, in that use case, but if you have a Delta Parquet table in other platforms that support Delta Parquet, my understanding is that Direct
52:27 Lake would work with them as well. Yeah, you can even bring your own Azure Data Lake Gen 2. So imagine you've built a whole bunch of Delta Parquet tables somewhere else and you want to just expose them to Power BI: you don't have to go do a bunch of work. My understanding is you can just create a shortcut to your Azure Data Lake Gen 2, point to where the Delta tables are, and Power BI can read directly into those things. Like, what the heck? There's potential here for
52:57 years of work at companies that have already been building these Delta tables to be instantly leveraged, in a matter of moments, by connecting the lake and then using Direct Lake; there's no import mode now. This is one of the comments made by Tristan on Twitter: he did a very simple demo, and again, more testing is required, I'm not going to say this is the end-all be-all, but he was loading a 10-million-row file via Direct Lake, and it was incredibly fast to get that data directly into Power BI. That's incredible, the speed of what
53:28 we're talking about here. Direct Lake will be a big game changer, because I don't want to have to manage a loading screen for Power BI; I just want to update a Delta table and have it immediately reflected in Power BI. That's huge. That'll make a big change for companies. So I agree, Tommy, from the standpoint that it is information overload, or capability overload, in that you're now giving every user
53:57 that you’re you’re now giving every user in the organization access to like these tools they have no idea how to use yeah does it does it mean they should just jump in and use them no I don’t think we’re suggesting that it’s like you’re instantly going to go into Enterprise tooling right sure what I what I need to test out yet is how what is the interaction going to be between I’m a straight business user and these things and how do they show up in one leg right if anything I would imagine it’s going to be more
54:29 imagine it’s going to be more discoverable and more reusable hopefully right and that is where a business user would start to like Leverage these systems more because and that’s why I’m saying it’s exciting from a from my perspective in a grow-up path because if I can start to interact with these objects or potentially replace them then then there’s a path in which bi teams and centralized bi teams can work with business users even easier and better than they did before what remains to be seen we’ll see how it
55:00 what remains to be seen we’ll see how it all all pans out I think a lot of this is predicated on the ability of like the one security model which isn’t in place yet right now correct like a lot of things are like for full Enterprise Integration would need to be implemented before you just like Carpe launch but I also think they offer more tool like capabilities and features when that’s in place as well so it’s like it’s like where are we at how can we use it but how how can we leverage it when it’s full ga
55:31 That is a challenge point, too, and you bring up a really good point here, so I want to point this out to the audience: there's a major challenge right now in that people won't want to implement full OneLake on all their datasets immediately, and that's this concept of row-level security, right? Especially in Power BI: how does the security of the Power BI elements get filtered down to these Direct Lake things? And so one of the things we didn't talk as much about, but will have to touch on briefly at the end, because it's in the title of the video, is this
56:02 concept of Direct Lake. We've been talking a lot about what OneLake means, collecting assets, all the under-the-hood technologies, but what Direct Lake does is allow a Power BI dataset to read Delta tables directly into memory, basically skipping the loading process. So yeah, I think the idea here is that the tables are already optimized for VertiPaq; VertiPaq doesn't need to do a load and compress. What happens now is the dataset itself can
56:32 go directly to blob storage, read the Delta files, and immediately load into memory exactly what it needs to support the reporting. Now, in my mind, there's definitely an opportunity for them if you think about mega-big Delta tables, tables that are massively large. If they're doing this already, there has to be a way for them to intelligently pick which partitions to actually load into memory, because of the way Delta tables are formatted:
57:03 there's a file that describes all the partitions of that table. And so I'd have to imagine the query could come from Power BI Desktop into the AS engine, and the AS engine can say, oh look, I've got to load these five tables. Probably right now they're loading the entire table into memory initially; I don't know how you'd test that, but imagine when they get smart enough to say, hey, this query came in, I only need to load the last two years of data. Say you have 15 years of data in the Delta table,
57:34 and it only loads the partitions relating to the last two years, because that's what the query relates to, or that's what the report is asking for, and then it could dynamically load more partitions as needed, further optimizing the need for memory. To me, this is incredible technology, and I think this is a direct feature challenge to what we're seeing with Snowflake and other systems, because those are all traditionally owned by IT.
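The partition-elimination idea Mike is describing, read the table's log, keep only the files whose partition values match the query, can be sketched against a mocked-up log. The JSON shape below is a simplified stand-in for Delta's real `_delta_log` entries, not the actual format, and whether Direct Lake prunes this way today is speculation in the episode.

```python
import json

# Simplified stand-in for a Delta transaction log: one "add" entry per data
# file, each carrying its partition values. 15 years of yearly partitions.
log_entries = [
    json.dumps({"add": {"path": f"year={y}/part-0000.parquet",
                        "partitionValues": {"year": str(y)}}})
    for y in range(2009, 2024)
]

def files_for_query(entries, min_year):
    """Partition pruning: only files whose 'year' satisfies the predicate get loaded."""
    keep = []
    for line in entries:
        add = json.loads(line).get("add")
        if add and int(add["partitionValues"]["year"]) >= min_year:
            keep.append(add["path"])
    return keep

# A report asking only for the last two years touches 2 of 15 partitions:
print(files_for_query(log_entries, min_year=2022))
```

Because the log describes every file's partition values up front, an engine can decide which Parquet files to read before touching any data, which is exactly the memory-saving behavior being imagined here.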
58:05 traditionally owned by it and frankly the business doesn’t care about those other systems they want their data now and the way they want it so how do we how do we blend these like so there’s going to be a whole bunch of more like negotiating between who really owns the data when do we transition from one one person to another data catalog anything is going to be a major issue to direct like I think of all the features this one’s still the weakest at preview right like you look at I can’t use calculate column like it has a lot of same limitations that like live connection right when that came out right agreed no real level security is a
58:36 right agreed no real level security is a no no no yes like because I can’t I can’t use it right so it’s it it’s got stuff like that but on the flip side of that having having now a new connection type that is not direct query right direct query is like I’m interacting with something in the report and a query gets sent to a database it runs and then retrieves the information this is almost like a stateful it’s like a live connection it is like an analysis Services back end right
59:07 like an analysis Services back end right where where it’s very similar in the state that it’s like if you think about ETL and processes that are kicking off the amount of time that you’re saving right now if you’re using import with partitioning and incremental refresh right where you’re now have a big load to do like there could be a half hour an hour two hours to get your data to a place where somebody can consume it 100 in these states now it’s instantaneously available whenever it’s in the table whenever When anybody some when somebody
59:38 interacts with a report. That's a huge time savings, and you get what I'd call a stateful, almost production-like table: my report is directly tied into the object where my data lives. That's very intriguing, and I really want to use it. I also need row-level security, though. So I'm super excited
60:08 about where this is going, but I'd definitely recommend checking out the Direct Lake overview to see what's available right now and what else is probably coming when the additional security features land on top of it. I like it. So, today's quick ChatGPT question, using Bing Chat today: how will Power BI and OneLake impact my business? The answer: Microsoft OneLake is your
60:38 centralized data lake that eliminates data silos and provides a unified storage system. Its deep integration with popular applications like Excel, Microsoft Teams, PowerPoint, and SharePoint means relevant data from OneLake is easily discoverable and accessible to all users in the Microsoft 365 environment. We'll see where this goes, because right now everything I'm thinking about with OneLake is inside Power BI. But there is a feature we didn't
61:08 talk about that I caught: OneLake has a file explorer you can download. I didn't catch this earlier, but it's a feature directly related to Microsoft 365 accounts; it's like OneDrive for your lake now. To me, again, this creates a whole new set of problems. Sorry, side note on this one, but if I upload an Excel file, it goes into the lake, and I start consuming it in a pipeline, what does that mean? If I update that file and
61:38 add a column, delete something, or change some data, does that break everything downstream? Yes, it could totally do that. Potentially this forces more rigor around when you certify a dataset: one of your rules might be that you can't pull data from Excel files, or at least not without a quality check. So there's going to be more discussion around how you manage data quality with this new feature called OneLake, because it gives you more ways to get data into the lake, which is good, but it also comes with some risks.
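The quality gate the hosts are describing could, in spirit, look something like the sketch below: check a dropped file's schema against an agreed contract before anything downstream consumes it. The column names, CSV layout, and `schema_drift` helper are all hypothetical; a real Fabric pipeline would use Spark or Dataflow tooling rather than raw Python:

```python
# Minimal sketch of a pre-pipeline schema check for files dropped into the
# lake via something like OneLake file explorer. All names here are
# illustrative assumptions, not part of any Fabric API.
import csv
import io

EXPECTED_COLUMNS = ["order_id", "order_date", "amount"]  # assumed contract

def schema_drift(csv_text, expected=EXPECTED_COLUMNS):
    """Compare the file's header row to the expected contract and report
    which columns were added or removed since the contract was agreed."""
    header = next(csv.reader(io.StringIO(csv_text)))
    return {
        "missing": [c for c in expected if c not in header],
        "added": [c for c in header if c not in expected],
    }

# Someone edited the spreadsheet: renamed 'amount' and added a column.
drift = schema_drift("order_id,order_date,total,notes\n1,2023-01-01,9.99,hi\n")
print(drift)  # {'missing': ['amount'], 'added': ['total', 'notes']}
```

A check like this is the kind of thing a certification process could require before a user-editable file is allowed to feed downstream reports.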
62:10 There's also the familiarity factor: the file explorer is going to look like Windows Explorer, the same folders and objects a business user is already very comfortable with. So yes, you just replace the Excel file and boom, everything updates. I think there's going to have to be more conversation around what a certified dataset looks like, because we're now talking true data
62:41 lineage, from where the data came from to how it gets into the report, and you're going to want to control that to a better degree when you start talking about certified content. The last point it makes here is about the OneLake data hub in Fabric, which lets you discover data in the lake. That's my other big point: as more lake tables become discoverable, you can find data you already have permission to use, or that the owners have marked as discoverable. That's another interesting thing, because
63:12 there should be a new habit: instead of going out and building your own product master table, you should first ask, did someone else already build this? Can I use it? Is there a certified table producing this? That's where I should be starting. And, we're over time already, but that's also one of the benefits of OneLake. Exactly right, because in an IT or enterprise ecosystem, or in the business, how many copies of the same dataset do you have all over the place? If I can eliminate all that duplicate storage, that's part
63:43 of a story that it's interesting Fabric isn't leaning into, or maybe they are and I just haven't seen it: consolidation of data, so you don't have multiple versions of things everywhere. It's going to be cool. I really like where this is going, and I think there's a lot more conversation coming on this one. I need to do a lot more testing and real-world work here. Thank you very much, everyone, for hanging on; for those of you who hung on the extra four minutes we ran over, we appreciate you. I hope you got some interesting insights into what OneLake is and what Direct Lake is. I'm pretty sure we'll talk more about this in the future.
64:14 Thank you very much for listening. Our only request: if you found this valuable, please recommend it to somebody else and let them know this is an interesting podcast that talks about the random things you're interested in. Tommy, where else can people find the podcast? You can find it anywhere podcasts are available: Apple, Spotify, YouTube, Google Podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. If you want to join us live, you can do so every Tuesday and Thursday at 7:30 AM Central. Awesome, thank you all very much, and we'll see you next time.
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
