Fabric Too Simple or Too Complex? – Ep. 392
In this episode, Mike and Tommy wrestle with a question they keep hearing: is Microsoft Fabric too complex or too simple, depending on who’s using it? They also dig into the ‘myth of the data catalog’ and why getting clear on definitions and outcomes matters more than buying another tool.
News & Announcements
This episode’s “beat from the street” kicks off with a common request: “we need a data catalog.” Mike and Tommy’s take is to slow down, define what you actually mean by data catalog, and align on the outcomes you’re trying to drive before you evaluate tools.
- Power BI Theme Generator (Tips+) — A quick way to generate and manage Power BI theme JSON for consistent report branding. If your org is trying to standardize design across many reports, this is an easy win that reduces rework and keeps visuals aligned.
- The Explicit Measures Podcast page — The home for the show, including all episodes and links to listen on major platforms. If you want to follow along live or browse recent topics, this is the best starting point.
- Submit a mailbag topic — Got a question you want Mike and Tommy to debate on-air? Use the mailbag form to submit topics and real-world scenarios (the messy ones are usually the best episodes).
Main Discussion: Fabric Too Complex or Too Simple?
The core question is one Mike and Tommy keep hearing from teams adopting Fabric: does it feel like a simplified “one-stop shop” or an overwhelming platform with too many moving parts?
Their answer (unsurprisingly): it depends on the persona, the expectations, and where you start. Fabric can feel “too simple” to experienced engineers who want granular control, while also feeling “too complex” to analysts who just want to move data and build reports without learning a whole new stack.
Too simple for engineers (and why that’s not always a bad thing)
For seasoned data engineers, the desire is often to:
- control the tooling end-to-end
- tune performance at every layer
- enforce strict DevOps patterns
- pick best-of-breed components per workload
Fabric’s opinionated experience can feel constraining in that context. But Mike and Tommy point out that “simple” can be a feature when it reduces the time-to-value for an organization—especially when the real bottleneck isn’t the tech, it’s coordination across teams and environments.
Too complex for analysts (and how teams accidentally cause this)
For analysts and BI teams, complexity often sneaks in when:
- roles and responsibilities aren’t clearly defined
- a workspace becomes a dumping ground of artifacts
- the organization expects one person to be an analyst, engineer, admin, and governance lead
A major theme is that Fabric adoption frequently requires explicit decisions around who owns what (pipelines, lakehouses, semantic models, governance), otherwise the platform feels harder than it needs to be.
The best entry point is rarely “everything at once”
Rather than trying to “do Fabric” in a single jump, the recommendation is to anchor on a clear first use case (and a primary persona), then expand:
- Start with a Power BI-first scenario (semantic model + reporting) if your primary outcome is analytics.
- Start with ingestion/orchestration patterns if the pain is data movement and standardization.
- Add governance and cataloging capabilities after you’ve proven the foundational workflows.
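Later in the episode, Mike describes turning documentation into a measurable gate: scan your certified semantic models and track what percentage of measures have descriptions as a KPI. A minimal sketch of that calculation, assuming you have already exported measure metadata (for example via semantic link labs or the DAX INFO functions) into rows like these — the field names are illustrative, not an actual API schema:

```python
# Hypothetical rows as you might export them from a semantic model scan;
# the keys are illustrative placeholders, not a real scanner's output.
measures = [
    {"model": "Sales", "measure": "Member Count", "description": "Distinct active members"},
    {"model": "Sales", "measure": "Churn Rate", "description": ""},
    {"model": "Finance", "measure": "Gross Margin", "description": "Revenue minus COGS over revenue"},
]

def documentation_completeness(rows):
    """Percent of measures with a non-empty description."""
    if not rows:
        return 0.0
    documented = sum(1 for r in rows if r["description"].strip())
    return 100 * documented / len(rows)

kpi = documentation_completeness(measures)
print(f"{kpi:.0f}% of measures documented")
```

The number itself is the point: at 99% you keep building; at 50–75% on certified content, you pause new development and backfill descriptions, exactly as described in the episode.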
The point: if you’re feeling whiplash between “this is easy” and “this is impossible,” you’re probably trying to make one platform serve multiple personas without a deliberate operating model.
Looking Forward
As Fabric matures, the tooling will keep improving—but Mike and Tommy’s bigger prediction is that success will come from teams getting serious about the fundamentals: clear ownership, a defined path from raw data to curated assets, and an intentional approach to governance that matches the organization’s maturity.
Episode Transcript
0:35 good morning and welcome back to the explicit measures podcast with Tommy and Mike hello everyone welcome back good morning Mike good morning peoples good morning internet welcome back to our podcast this is a recorded episode it'll be very clear we had some scheduling things happening as always people got to do things go places and we can't always do all of them live so this one is a pre-recorded episode so you're more than welcome to chat in the chat window ask questions and things but just be aware Tommy and I probably won't be monitoring the chat as closely today
1:05 because we'll be doing other things I'll be in an indoor water park for my daughter's birthday so oh okay that's a great excuse so I'm enjoying the podcast but we're gonna have some fun Wisconsin all right yeah exactly you don't get to do that really anywhere else Wisconsin's everything indoor especially around now it's cold right now dude we got the snow and I think Wisconsin definitely realizes well you got like a week and a half of good outdoor pool weather so we're going to build a bunch of indoor ones it's true very true let's
1:36 go into our main topic today so just very quickly here this is a mailbag there's going to be a topic here around Fabric right as more and more things continue to roll out into Fabric and as we continue to see features get refined and added into the Fabric ecosystem there's probably some questions around what people's roles need to be inside Fabric and so really the topic today is around is Microsoft Fabric too
2:06 complex or is it too simple right too complex for engineers or sorry too simple for engineers but too complex for data analysts right so that's going to really be our main topic for today but before we get into that Tommy do you have any news or any beat from the streets I do and this is around data catalogs or I want to say the myth of a data catalog so okay I don't know if you get a lot of requests but I'm getting a ton of requests about hey can we build a data
2:36 catalog we want a data catalog and one thing I've been asking before we start those projects is can you define for me what you think a data catalog is that's a great question to ask because the thing is if you were to actually Google this look in Wikipedia ask your AI systems there's a thousand ways to slice and dice it and it can be incredibly complex for companies or organizations who are still at a maturity level of 100 where they're talking like well we need metadata and we need you
3:07 know source system and lineage it's like well you're still dealing with things in Excel so the thing that I've been realizing is a lot of people want the myth of the data catalog and I don't know Mike if you're seeing this but I'm seeing a ton where people want this idea this concept of we know where all our data is coming from we know how it gets engineered or processed and I like to use the member count example we like to know how member counts are calculated we can easily look at it like we were to
3:37 look at a recipe and that sounds great the idea of that is great that everything is nicely cataloged everything is nicely documented but Mike I almost feel like this is a myth for most organizations oh man this is a really loaded question so a couple things around your comment one comment you made there was maturity level 100 I believe you're referring to so for people who aren't following along or haven't been with the podcast for very long a maturity level 100 refers to
4:07 the Power BI adoption roadmap and there are different levels of maturity one of them in your data space is a level 100 so it goes from level 100 all the way to level 500 roughly you can categorize yourself level 100 is like you're just brand new like you haven't really done too much rigor around data things and then typically in these level areas a 500 level is you have processes involved you are regularly reviewing it you have a team that understands
4:37 the requirements and the specifications you have a bunch of things in place you are regularly winning at that particular area and I think to your point here one point is Tommy how are we getting the data catalog the first thing that comes to mind is do you want your data catalog to be actively maintained or passively maintained that's another great point here the reason I bring this up is you can build a data catalog and I've helped people build a data catalog of sorts inside SharePoint hey
5:08 here's our documentation here's some of our models we're building a data model and we're going to take a couple screenshots of it and put it here all the measures get output into a list of items you can just put them here's the descriptions of them we now have INFO.VIEW functions so if you make a model you can self-document columns tables and measures and those can just be there but it requires someone to go in and actually write an accurate ample description of all the measures all the
5:39 columns all the tables so I guess my point here is if you wanted to document every single thing you have Excel sheets all the way to certified or very widely used datasets I think you're going to be spending a lot of time just having people write things down and not a lot of time building insights or reports or things that are actionable that you need to go do your business with so I think it's a balance right of the information that you have in your company that is the most
6:09 important that needs to be documented and put somewhere so I thought you were going to go a different way when you said the actively maintained thing here because one thing let me finish my point here before you jump on that part I think that's where I would again I always focus on what is certified let's document those things let me go back to the active versus passive monitoring because I think that's where you're going to go with this next which is if it's active that means every time a change occurs I need to involve a person
6:41 to go write down the updated metric or whatever those things are right also with that that usually means let's call it if I throw content out there into the company there's not a lot of opportunity for other people in the company to come back and say well this measure worked for me but it didn't really work here or this measure works when I use these dimensions but not those dimensions or this measure is used in this metric and it's also used in these reports right so there's other
7:11 people who could potentially use these things and find out where these measures are being used across different parts of the organization or in different models and reports but it's the active part one person is writing it documenting it putting it someplace and it becomes static it's not a living document I think the data catalog is more passively monitored what I mean by that is whatever you do document it in the models and then have the models like automatically scanned
7:41 and collecting the measures and the things together is there lineage to stuff yes there is well the system itself knows what the lineage is can we just self-discover the lineage can we just figure out where everything came from automatically in a passive way right I'm not actively writing the documentation the tool itself provides its own level of documentation so that's what I mean by active and passive yeah is that where you were going that's really really close and one of the things that I said is we can
8:11 build a data catalog but if you want a data catalog that you and the organization are satisfied with then there are other things we need to do to your point Mike if the lineage automatically happens there's a lot of assumptions that things are already built in a certain way right that doesn't just happen and one of the things is a lot of people want the data catalog very much that cliche the carriage before the horse so to speak where they want this idea the concept of a data catalog but they don't
8:41 have the data governance in place they don't have those processes in place because the technology is not going to just magically do it if you don't have the lineage and the data flowing a certain way if I have things coming from SharePoint and Excel and manual efforts that cannot be seen from the platform how are you going to ever document that how would you ever know those are things you're not going to know you're not going to know what people do after you'll know up until the person who does the massive export to Excel yeah but your process basically fails so
9:12 your point there though is is it more important to have the data catalog or if you reverse that a little bit and say well actually it's not important to have the data catalog what's actually important is to have the process in place 100% to get the information out of people's heads again this is where I think the certified barrier is a good place to draw the line you could document everything I think to your point Tommy if we can get more passive things to scan everything in the whole system semantic link labs does a great job of this you can point
9:42 semantic link labs at a workspace and scan all the lineage of the reports to semantic models you can get all the semantic model details and information out of them but if people don't spend the time initially in the process of documenting the columns and the measures in the models they're actually building
9:59 then what good is it to you like you can look at the output let's just to your point Tommy what defines a subscriber right what does that look like in your organization is that someone who showed up one time and subscribed and now is no longer paying does that mean this is someone who's paying for something who's regularly showing up like what does that look like there's a whole bunch of potential definitions by department of what that means and you can scan the models and say this is the calculation this is the measure that defines that
10:30 that calculation and you can scan that across all your semantic models and you can see everywhere in your organization where that is worded or where that column's being taken care of my point here is just because you have that information one do you have a system in place that can go get it most people most organizations do not and two what do you do with that information like if I have seven definitions of the same measure or what we think is the same measure how do you go back to a central body or governing body or
11:01 how does the organization align on what is this definition well let me do you one better with that because I love that point you may actually have a quality data catalog as of January 15 2025 but to your point without a process or some automation but more importantly to me accountability and ownership all of a sudden a month later there's weeds on it because the way the data gets in changes so yeah you can build a data catalog oh I like the weeds part
11:32 yeah but the problem is it's ever evolving like that member count probably there's some logic that's going to change it's like well we now incorporate this we bought this company or whatever the situation is yeah it updates and changes so even if you build a data catalog you say we have our definitions and maybe you painstakingly went through everything maybe we're like we're not going to do the automation well Mike a month or two later or let's say six months if that's not actively updated well my friend we're going
12:02 to have some problems because we're going to be outdated agreed all right I think this is a good beat from the street and we've spent a lot of time just talking around this topic I don't know if there's any other main points here again if I'm going through this topic here I think this is very needed I think I would also point out in order to have a good data catalog you need to have some process in place or you need to start with the process 100% 100% we're going to
12:33 build a data catalog let's start with what can we do today what do we have in place right now for our process that helps everyone who's building semantic models and this gets harder the larger your organization is because every model builder who's building certified things needs to do this and I think this is one of the major tasks when you read the Power BI adoption roadmap when you build the Power BI adoption roadmap you look at going hey I need to make sure that I have the ability to have a gate to
13:06 say some data set is certified one of the requirements to gate something into a certification because I can limit the amount of people that can actually say this is approved to be certified one of the features in that gate is have you documented the model do you have the measures and the columns documented so people know what you're doing to me that's a nice clear clean way to measure that and now you can actually run APIs to go grab all of my semantic models the ones that are certified how many columns are
13:37 that are certified how many columns are documented not col not documented this now becomes a kpi that you use to talk to the organization like how good are we at our process so put the process in place make sure it’s part of your rigor measure it and then do things about it that way when to speed up hey we’re at 99% completeness great keep on building people you’re doing you’re following the process oh no we’re at 50% or 75% completeness that’s not acceptable for certified things let’s
14:08 acceptable for certified things let’s slow down the development of new stuff and step back and find out where is the places that we need to document and add more information to help people use things yeah the whole goal is here is to get people to use more of the powerbi right I I love that and that’s all that’s really the biggest part I think I have a topic here I’m going to put in on our little board for future episodes but have you been using at all with speaking of definitions I have this great Macro for and I think I don’t remember who actually developed it but I
14:38 modified it basically I choose my measures I ask ChatGPT-4o to create the definition and it has this nice business definition with the measure formula underneath it using that new DAX INFO function I basically have this table with all the definitions for my measures it's a great quick way to get definitions and I have to admit those definitions from ChatGPT have been solid
15:10 solid so two observations around that one is to your point Tommy if you can give ChatGPT enough context to understand it's good at understanding the formula how it works what it's trying to calculate and providing that business translation and again it's not going to be perfect all the time but at least it gives you a starting point better than nothing better than nothing so it gives me some place to react so another thing I've heard people communicate to me is individuals are more likely to react to something that's wrong than to give you something from scratch so let me say that one more time yeah people
15:41 people are more willing to give you input or feedback on something that's wrong versus building something from scratch and I think the reason is I think this is true like it's hard to get started like well what does this measure mean I have to understand it if you can at least get some semblance of a definition from ChatGPT then you can refine it you still need someone there to make sure it's right and accurate and actually makes sense for what the users will need yeah but at least you have a starting point you have a place to react from I'm doing this with my
16:11 daughter's homework I'm doing this more and more and my wife was asking me the other day hey I need to write this email about this topic or whatever I said well throw it in ChatGPT tell it what tone you want to use what's the main point you want to make I'm good at writing bulleted lists I'm not good at writing a lot of fluff so I use ChatGPT to help me add some more context around things hey look I need an email going to this person with this tone and these are the bullet points I want to convey write this in a two paragraph statement whatever
16:41 does a good job of that but that part I really like about it love it okay anyways really interesting thoughts around data catalogs I think we could probably do more topics around the people and the process part of this maybe Tommy we go dig through the Power BI adoption roadmap and find some topics around that actually it's not Power BI anymore it's the Fabric adoption roadmap and figure out what we can do there to help people document their stuff okay well that being said let's go over to our main topic today so
17:11 our main topic today is a mailbag we have a question that has come in and Tommy since we're down to two now you're going to be our mailbag reader so go ahead Tommy take it away give us your best mailbag voice well we'll say mailbag reader for now so we'll see how it goes this is from and I'm going to already botch the name so I apologize but Shahim MTI Greg's recent LinkedIn post about Fabric not being adopted by businesses due to it being too
17:41 simplistic for data engineers and too complex for data analysts in his opinion and he shares a post by Greg and also Sam McKay about Power BI maybe should not be part of Fabric I think this would make an interesting debate I have the opinion that Fabric is opening the way for data analysts to become analytics engineers but from an upskilling perspective I'm uncertain about how it fares against the likes of Data
18:12 bricks and Snowflake in terms of being the tool of choice for orgs looking to improve their data ecosystem and I just want to give that a little context I'm just going to read a blurb from Sam McKay about basically what he's really sharing here to really give context for our conversation today Power BI saw so much potential in democratizing analytics now that vision feels like it's not necessarily the same I feel like we're stuck between choosing five
18:42 we’re stuck between choosing five different data storage options and navigating layers of unnecessary complexity powerbi should not be in fabric so these are some very hot topics and again it’s not shahim is looking at these different posts and saying what do you guys think but we have these two different posts from people who’ve been in the game for a while that are really voicing two I think two things Mike one power fabric is too complex for data Engineers who’ve been in the game but too or too simple for
19:12 in the game but too or too simple for data Engineers who’ve been in the game but way too complex for your normal data analyst and also this idea of well maybe powerbi shouldn’t be part of fabric it it is his own entity so I think without without being too extravagant there’s a lot lot to unpack here yeah I I think there’s a lot of I here yeah I I think there’s a lot of there is I I feel like there was a mean there is I I feel like there was a wave of maybe about criticism around Fabric and the ability and and the need for Microsoft to
19:43 why are these two things coming together why are they being joined Power BI was great by itself why don't we just keep investing in Power BI pieces and I think this is a fair question to ask especially now that you've seen the tool for a number of years and been able to
19:58 unpack a little bit of what the tool is doing here I think I will disagree very clearly with these two opinions I do understand their points I think I understand why they're struggling to figure out what is the adoption of Fabric how is this working where does this get placed inside a library of tools and things for the business team to use and build on top of so
20:30 I'll give you very quickly here if I had to step back and give you the overall summary of what's happening to Fabric right now there's a lot of Microsoft data products the data platform has a lot of data products in it Fabric is the collection of all of those data products into a single ecosystem that's what it is everything that we're seeing right now we have Kusto DBs which were added to the database system we have now SQL databases coming from the Azure
21:02 space as well we have pipelines which was Azure Data Factory so I think if I had to compare there's a progression that got us to Fabric the progression was build things in Power BI realize hey we need better data engineering tools and we need to bring the data closer to where the Power BI things are so I really like Direct Lake I think that's a very strong feature I think people should use it more and I'm even finding small to medium-sized businesses getting an immense amount of value from this tool yeah already
21:35 and again I think these points are very valid I think the point in the article is talking about when you compare Fabric against the incumbents right Snowflake and Databricks they've had many more years to develop and build out things that are much more robust the tools have been developing they've been getting customer feedback a lot longer than Microsoft has I think you're always going to see Microsoft just trying to catch up there's an ability for
22:05 them to get to a place at some point in time you're going to start seeing the competition being very equal and you might even see the scales tipping more towards the Fabric space and I think I would say I called this one right when Fabric was initially released in GA I said wait it out it's going to take about a year for things to really start getting good and you're starting to see really huge value being produced from this we're now just rolling into the new year now we're almost at a full year's cycle of Fabric
22:37 being GA we're a year plus now that happened in November yeah okay so we're at a little bit more than a year now and to be honest there were pieces of this that were in the beginning very friction filled I couldn't do things I couldn't get data moved around the way I wanted to there were too many limitations but now we're seeing all those friction points being removed and it's becoming easier and easier to utilize the tool and things that I was finding
23:08 easier in other tools such as Delta Live Tables in Databricks there is now mirroring that's starting to appear inside Fabric so it's not the same feature it's Microsoft's flavor on how they want to do it but there's a lot of similar features going across here so that's my initial reaction to Fabric in general this is a really good debate though where do you land on this topic Tommy where do you feel like you fit so I am going to play devil's advocate for the conversation because there's a lot here that I can totally
23:38 there’s a lot here that I I can totally understand but I want to preface that with man it takes something when I’m diving into something or or 100% I got to be excited for this and a lot of the things fabric has done especially in the last few months I’ve been excited for because I’m like I wish I had that like six years ago man I I wish I had that four years ago this would have been great this would have helped this situation or this project but let’s let’s rewind a bit and I think this is really important to do let’s go 2015 really to 2023 right where fabric was
24:10 really to 2023 right where fabric was right before fabric was released or even in preview most organizations and companies could survive and thrive with just a powerbi shop from if you were to say business intelligence in an organization it could be just powerbi it did not have to be all these other data products and it worked out really well you could have a maturity of 500 I’m sure there was some data engineering team the organization must have had but for the most part you could survive and
24:40 thrive with just Power BI Power BI is not diminished in its feature set it's not like we've taken Power BI and it's no longer the same product it's that and more if you were to separate it from Fabric right now and I think that's a very important distinction to make right now yeah yes we have all these things in Fabric yes Fabric is Power BI and data engineering and Data Factory and SQL database is great but it's not like we lost anything with Power BI so one could
25:10 make the argument and I feel like you can make the argument pretty well if I wanted to as a Power BI shop as a BI shop in an organization or even as a consultant that one I'm a little wary on but let's say more internally if I wanted just to do adoption Power BI style I can and it can work and I think that's a really important distinction to make here I guess the yes I agree with that one but I'll also argue here a lot of
25:42 but I'll also argue: what other alternatives do you have? There's always going to be this need for lakehousing things, right? When I look at the spectrum of where Power BI came from and where it's at now: initially, a lot of Power BI architectures were "let's spin up a SQL Server, put some data there, and consume the data from the SQL Server." That was a very common pattern.
26:12 "Hey, I have a SharePoint list. Hey, I have an Excel sheet or a CSV file." Those were the things you were traditionally consuming: structured data. Now I think we're getting into a world — and I've said this for a number of years; we're not even there yet, but we're getting much, much closer — where every single application you build needs to have an API. There's going to be a whole bunch of semi-structured JSON data showing up in your environment for reporting. How do you shape that into good tables?
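That "shape JSON into good tables" problem is the kind of thing a notebook handles in a few lines. A minimal sketch in plain Python — the payload and field names here are entirely hypothetical, just to show the flattening step:

```python
import json

# Hypothetical API payload: nested customers and order lines.
raw = json.loads("""
[
  {"id": 1, "customer": {"name": "Contoso", "region": "West"},
   "lines": [{"sku": "A-100", "qty": 2}, {"sku": "B-200", "qty": 1}]},
  {"id": 2, "customer": {"name": "Fabrikam", "region": "East"},
   "lines": [{"sku": "A-100", "qty": 5}]}
]
""")

def flatten_orders(orders):
    """Explode nested order lines into flat rows, one row per line item."""
    rows = []
    for o in orders:
        for line in o["lines"]:
            rows.append({
                "order_id": o["id"],
                "customer": o["customer"]["name"],
                "region": o["customer"]["region"],
                "sku": line["sku"],
                "qty": line["qty"],
            })
    return rows

rows = flatten_orders(raw)
# rows is now a flat, table-shaped list, ready to land somewhere queryable.
```

In practice this same shape-then-land step is what a Spark notebook or pipeline does at scale; the logic is the same, only the engine changes.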
26:42 Well, there's a lot of existing tooling that works really well with this: Python notebooks, Spark compute. There are other spaces doing really well here. So as Microsoft, you can do one of two things. You could sit on your laurels and say, "we're just going to continue to optimize Power Query, make it even better, keep adding features to it." Or you can look at what people are doing in the data lakehouse space: Snowflake runs on Parquet files and the Iceberg format,
27:15 Databricks runs on the Delta format. These open standards are showing up, the Spark engine is gaining in popularity and can compute more data at less cost. It's becoming a race to the bottom: how can I do all the data compute at a lower price, and faster? So with all of these things you're seeing in Fabric, Microsoft had to make a decision. Look, we tried Synapse. Synapse was an attempt to build this; they tried to keep Synapse and Power BI separate, and in doing that,
27:45 Synapse didn't really go very well. It didn't get the features we needed; I don't think they're adding any new features to it. They're still supporting it, but Synapse, in my opinion, was a flop. It was a great idea. It's like Windows 10 now — I don't know if they officially said they're not supporting it this year — it works, they're going to continue maintaining it, but am I building new architectures with Synapse? Probably not. I'm going to steer as many people as I can toward the Fabric space.
28:16 I know they have Azure Data Factory — really robust, super well built — but they started rebuilding pipelines. So maybe some of these articles, and what Greg and Sam are pointing at here, are a bit of "hey, Microsoft, you already had really good products, and they were already well equipped for those users in that area." Data engineering liked Azure Data Factory; it did what it needed to do. If we needed lakehouses, we could spin up blob storage and push the data down there. That was fine, but it doesn't
28:47 need to be part of, or so closely coupled with, Power BI. I think maybe they're arguing that they just want Power BI to stand alone. To be clear here: you don't need Fabric to run Power BI. I think this is one of the points these articles are missing — you're just complaining about Fabric being next to Power BI. Okay, then don't use it; go do your stuff somewhere else. I use Databricks with Power BI all the time, not a problem.
29:19 But if you want optimized costs, I'm not sure you're going to make that decision. If you have very large data and very skilled teams in the Databricks space, yeah, do it in Databricks; I don't have any problem with that. Cost aside, I think there's a disruption here with teams and the people actually using these tools. To me, if I were to put my feet in their shoes on where they're coming from — "put your shoes in their feet," yeah; I put my pants on one leg at a time. But no, I think
29:51 the disruption here is with teams and
29:55 groups who are now being told, "you're in the Fabric world." And there are two points I have here; I'll touch on each briefly and see what you think. One: take people now being told they're diving into Fabric. You were Power BI at heart — I think both of us have that, though you were much more the one leaning between the data engineering world and Power BI. But for a lot of people, "okay, you're going to do data engineering" doesn't necessarily jive. If, all of a sudden, say, Fabric went away and now we had to
30:26 do Synapse, data lakehouses, and everything in Azure, you couldn't necessarily make that jump. It would be very frustrating, because yes, the skills are the same, but it's a much different process. So yes, you're doing data engineering, but not really in the traditional sense, and I think that's one of the issues here. The other issue is — actually, let me stop there.
30:57 So I think the jump to people saying "well, I do data engineering now because I do Fabric" can be very misleading. I'll pause there before I go to the next point. I guess I would go back to your question from earlier about catalogs: define for me the data engineer. What do they do? Let's define what that means. If I'm going to just throw out a definition — this may not be right, but I'll throw out something — a data engineer is building a process to take data in one form, typically a raw form, and transform it into usable data or usable
31:30 forms: doing things like de-duplication, doing things like aggregations of data. They're building a process where the output of the data is ready to be consumed by another team or other people. If that's your definition of what a data engineer is, you could potentially apply that definition to people who work in Excel, people who work in Power BI and in Power Query.
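That loose definition — take raw rows, de-duplicate, aggregate, hand off something consumable — fits in a few lines of plain Python, whatever the tool. A sketch with made-up records, just to make the definition concrete:

```python
from collections import defaultdict

# Hypothetical raw feed containing one duplicate record.
raw = [
    {"order_id": 1, "region": "West", "amount": 100.0},
    {"order_id": 1, "region": "West", "amount": 100.0},  # duplicate
    {"order_id": 2, "region": "West", "amount": 50.0},
    {"order_id": 3, "region": "East", "amount": 75.0},
]

def dedupe(rows, key):
    """De-duplication: keep the first row seen for each key value."""
    seen, out = set(), []
    for r in rows:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

def total_by(rows, group_key, value_key):
    """Aggregation: sum a numeric column by a grouping column."""
    totals = defaultdict(float)
    for r in rows:
        totals[r[group_key]] += r[value_key]
    return dict(totals)

clean = dedupe(raw, "order_id")
summary = total_by(clean, "region", "amount")
# summary -> {"West": 150.0, "East": 75.0}
```

Whether this runs in Power Query, a dataflow, or a Spark notebook, the process is the same; that's exactly the point being made about the definition being tool-agnostic.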
32:00 You could define it that loosely. And again — maybe this is a bad thing to ignore, but let's ignore the training, the skilling, and the IT side of things: if you build something like that, you're engineering data. Now, if you want that data engineer to be a good data engineer, they should have some sort of pipeline process, so things can move through a pipeline. They need to be able to monitor the thing so it
32:30 works well, and they need data quality. These are other aspects of the role that need to be there to be a good data engineer. So I'd argue: if you're an analyst shaping data, bringing it to someone, and giving them an analysis, you're doing some level of data engineering. You may not have all the skills a traditional data engineer does. And for the more traditional data engineering role, you're going to look at the résumés and see things like: knows how to work with big data, knows how to work in Python,
33:00 knows how to write Spark SQL. There are very technical code languages usually required at the data engineer level. And that's what I wanted to get at: regardless — it doesn't matter whether you talk to data engineers or business analysts — people understand, "I need to change the shape of the data to some level to get it into something usable." Whether I use Power Query, whether I use a dataflow, whether I
33:31 use a pipeline, or even notebooks — these are different tools at our disposal. Do I need to have Delta Live Tables from Databricks in order to be a data engineer? What I feel like some of these articles are pointing out is "just stick with Databricks, it works just fine." Okay, well, what is it in Databricks that makes you more of a data engineer than
34:01 other tools do? The answer is that the tool offers different features that make it easier for you to be a data engineer; they're not adding characteristics that make you a better data engineer. The tooling is just different. You can build updating pipelines, you can build refreshes of things — those things come with Fabric already, and you can do a lot of this. It may not be as easy as Databricks, it may not have the seamlessness of a Delta Live Table; you may have to use
34:32 mirroring instead and see how that compares, but it might still get you the same result. So part of what you're saying sounds like "I created a lakehouse — look, Mom, I'm an engineer" — to an extent. But here's an important aspect of all the things you're saying: we're throwing around technical terms, yet at this point in time you and I have only spoken about one aspect, one product in Fabric, and this is where I
35:02 think the confusion comes in for a lot of people. We haven't talked about data science; we haven't talked about data warehousing and the rabbit holes those go down; we're only talking engineering. We haven't talked about real-time analytics, which is another big aspect of Fabric as well. So the confusion is: in the past I could say I was a "Power BI pro" — with trepidation, because we knew all the tentacles that are in Power BI. Even Mike, our first year doing the podcast, you were not a
35:33 fan of that term — not just because of licensing, but because: are you a Power BI pro in adoption? Are you a Power BI pro in report creation? And that was just a single product. This whole time we've only been talking about whether you're an engineer with Fabric — but how can you say you're a pro in Fabric without saying you're an expert in all the aspects of Fabric, which I think we've all said is impossible?
36:04 It's not impossible. I think, again, a lot of this is that they threw out a lot of different tools. Even now — let's just ignore Fabric — tell me you're a data engineer: data engineers are going to have certain classifications of what they're good at already. Even if you went back to Azure and ignored Fabric entirely, I would argue a lot of people in the data engineering space would know how to
36:34 work with Azure Data Factory, and may even know how to work with Databricks. But just because you work with Databricks and Azure Data Factory pipelines, it doesn't mean you're a real-time analytics engineer. That's a specific, narrowed area around that particular use case. So I would argue: we know that Fabric encompasses a lot of things. Are we experts in all those things in Fabric?
37:04 Probably not. But as time marches on, we have time to get projects and exposure, and to learn about these other aspects they're bringing into our experience inside Fabric. So yeah, if you're building something in Fabric and you want someone to become the expert in real-time analytics, you're going to go find an engineer who has prior experience, or who is willing to understand and learn real-time analytics solutions, and they can bring that knowledge to you.
37:35 You're not going to just hire a random person — "hey, are you a data engineer? Great, let's hire you. Oh, and by the way, we have all this real-time analytics stuff: you've got to learn Kafka, you've got to learn Spark, you've got to learn all these other things" — when they've never experienced it before. So there's some level of: if that information interests you, go get some training, get certified, take some classes, do the Databricks classes. There's a ton of free (or paid) information on the internet where you can get certifications to get
38:06 familiar with the technology, so you can make better decisions around it. I don't want to limit the ability to say "I'm a Fabric engineer" to someone who does every little single thing in Fabric, because I don't think that's going to be possible. You're going to have specialists in areas as well, potentially. So I think I have a way to shed a different light on this: I'm going to ask you to hire two people, and I want your take on what you're looking for. Okay.
38:37 on what you’re looking for okay so list what am I building so I’m leading up to that so the first one we’re H we’re creating and I’m GNA I’m G to be a little general here for the sake of argument we’re building a business intelligence team in 2025 okay our tools powerbi what are you looking for in that their skills and are you looking for Fabric and what they currently do at all great question I would in order for me to answer that question I’d actually have to go back and look at what is my team’s existing skills and what are we
39:08 team’s existing skills and what are we what what is our infrastructure currently looking like and built today right this will change depending on what our team currently knows if I have team members already today that are supporting the bi and we already have things that are in Azure we already have things that are in the cloud maybe we have lake houses maybe I could already have a lot of this infrastructure already existing somewhere else in my organization if I already have those things I’m going to be looking for like if I if I have that skill set on my team I may not need another person who knows more things about fabric however I need someone with
39:39 about fabric however I need someone with enough familiarity to say oh I can shortcut to these things or I can write code that can make shortcuts automatically so we can utilize our existing data Assets Now flip side this right if my company is less mature in
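The "write code that creates shortcuts automatically" idea would go through the Fabric REST API. The sketch below only builds the request body for a OneLake shortcut — the endpoint path and payload shape are assumptions from memory, and the GUIDs are placeholders, so verify everything against the official Fabric REST API reference before using it:

```python
# Assumed endpoint: POST /v1/workspaces/{workspaceId}/items/{itemId}/shortcuts
# (shape not verified here -- check the Fabric REST API docs).

def onelake_shortcut_body(name: str, src_workspace: str, src_item: str,
                          src_path: str, dest_path: str = "Tables") -> dict:
    """Build the JSON body for creating a OneLake shortcut in a lakehouse."""
    return {
        "path": dest_path,   # where the shortcut appears in the target lakehouse
        "name": name,
        "target": {
            "oneLake": {
                "workspaceId": src_workspace,  # placeholder GUIDs below
                "itemId": src_item,
                "path": src_path,
            }
        },
    }

body = onelake_shortcut_body(
    name="sales_orders",
    src_workspace="11111111-1111-1111-1111-111111111111",
    src_item="22222222-2222-2222-2222-222222222222",
    src_path="Tables/sales_orders",
)
# body would then be POSTed with an authenticated client (not shown).
```

Looping a call like this over a list of existing tables is what "shortcuts automatically" amounts to: reusing data assets in place instead of copying them.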
39:53 Now flip this around: if my company is less mature in
40:24 the data platform and Azure space — we have SQL Servers on-prem, smaller data sets, not as much volume of data, all of our systems live inside SQL, and we don't have any external APIs that complicate data engineering — okay, I could probably get away with a lot more Pro and Premium Per User licenses and just use people at that level. So really, you're asking a question trying to bait me into a certain answer. No, no, no — because I think my answer really depends on what my organizational structure is and what skills we have or are lacking. If we're on-prem with a lot of SQL Servers we're trying to move to the cloud, I'm for sure going to look for someone who has more cloud experience and more Fabric experience, because I'm going to want to migrate those artifacts off of on-prem. At this point — it's 2025 — I don't want to be in the business of managing an infrastructure solution. I don't want to manage servers; I don't want to have to go in, apply updates,
40:55 and fix things. There's just so much work in that, and I'd rather spend our team's time on getting the data there so it's always available, shaping the data into actionable insights, and optimizing the most efficient way to get the data from where it is to where it needs to be for users. And there's the crux of it. I'll give you my answer — I really wasn't trying to bait you; this was just to get your take — because when I thought about this, my first thought
41:25 was, "well, no, you could just do Power BI." And as I thought about it, I didn't like the answer I had for myself. Because honestly, if I were hiring a team right now, or bringing someone on — even internally — yeah, I'm going to focus on Power BI, but I'm going to hope they have some Fabric experience, or at least want to learn about it. If nothing else, I'll give them time to learn on their own. But the distinction is, I need them to have the
41:55 propensity to go there. In the interview I'm looking for things like: "hey, I love Power BI, I've been doing it for a number of years, it's great; I've been dabbling in Fabric a little bit here and there; I'm not an expert at it, but I've been adding some things, I'm finding it really interesting, and I can see the value of it." Okay, great — you'd be a more likely candidate for me to hire, because you've dabbled in it and you understand the surface area of it. You may not be an expert in all of it, but at least I know you're willing to learn that part as well. And this
42:25 is the big distinction here. I think a lot of us — including ourselves, and a lot of people in this post — are getting wrapped up in the role: you're a data engineer, you're a report builder, you're a data scientist. No: honestly, I'm looking for the most efficient ways to do something in data. That's what I've loved about Power BI. And guess what — I may not have a full-fledged Fabric ecosystem set up where everything's a lakehouse and everything's a Direct Lake, but there are a lot of tools
42:57 in Fabric that, to me, are an extension of Power BI, where I can still live in my Power BI world — but it's really nice to have this mirroring system, or, rather than connecting directly to this API, we're actually storing the data in a database or a lakehouse. That does not require me to be a data engineer; that does not require my organization to shift all their skills. Yeah, there's a little upskilling there, but that doesn't make me an engineer, so to speak; that just makes me more part of the business intelligence side. Just because I create
43:27 a lakehouse, and my marketing team's APIs are now storing the data rather than connecting directly in a Power BI semantic model that refreshes 18 hours a day — well, now I can store this data. There are more efficient ways to do things, but I think a lot of people are getting that approach confused with "no, you're shifting; you'll never do Power BI again; you're now a data engineer."
43:58 And to wrap that up: if I'm hiring someone right now, I want them to know the most efficient ways to do things in the Power BI data platform. Yeah, I think you're right; that's going to be a continued growth part of this. One thing I find very difficult right now in Fabric is just that there's so much in preview. There's a lot happening, a lot of new features coming out, and I can see a lot of things on the horizon becoming fully fledged features, making it easier for me to load
44:28 data in. In general, the whole experience of Fabric is just getting better, especially over this last year. If you had asked me, when Fabric was initially announced as GA, what I would say to a new customer, I would have said: let's wait a little bit — three to six months — let's see how it goes, see if they can refine it a bit more. Here we are a year later, and for me the mile marker — if I had a spectrum that said on one end I'm talking about Databricks and existing data
44:58 platform solutions all built in Azure, and on the other end solely Fabric — the line where I would start was heavily toward Databricks in the beginning. But as Fabric keeps adding new features, it keeps getting better — better reporting — and I'm getting more comfortable with it. The mark is moving farther and farther toward "I'm probably just going to start with Fabric now," and at some point the solution will be so compelling that it will be "I only want to use Fabric; I don't need to use
45:28 other tools; only in special situations do I need to." And I think Databricks and Snowflake are actually very aware of this. They're very aware that Microsoft is trying to take their market share in this area, and they're also going to have to be very competitive. Right now I think we're in a three-horse race: it's Microsoft Power BI and Fabric, it's Databricks, and it's Snowflake. And Snowflake is by far the most expensive of the
45:58 three solutions — if you want Snowflake, you're spending a lot of money to deploy it. So these teams are going to need to keep optimizing these solutions to make them the most efficient thing. And this is also the argument I'm making around dataflows versus using a Fabric notebook: if you're on Power BI Pro or Premium Per User, use dataflows until you're blue in the face, as long as it gets the job done. When data gets extremely large, though, dataflows don't perform so well. If you're talking hundreds of
46:28 millions of rows, I've seen dataflows just fall over. That's not bad; it's just a limitation of the way they built the tool. It's good for small to medium-sized data. When you need big-scale data — hundreds of millions of rows, or "I'm going to store every changed record that comes out of my operational system, and I don't want to store that in a SQL Server because it's too expensive" — well, the whole reason we're going toward this Fabric route, in any direction, is that we're looking for more efficient ways to do the same work.
46:59 And I love that point. You're making me think here, because I'd almost argue that Databricks and Snowflake are getting worried about Fabric at this point, because to me, Databricks and Snowflake are still the stick shift of data engineering. Going into those is not "I'm just going to start; my boss tasked me with a Databricks solution and we'll figure it out as we go." With Fabric, this idea that I have to be a data
47:29 engineer, or have to come from that background, or that if I do anything in Fabric then I'm no longer working in Power BI, is to me just false and misleading; it's not what Fabric, right now, is intended to be. Yes, it can do amazing solutions and encompass a lot of solutions, but it's the automatic transmission: it works, it can drive, it can go fast, but I don't have to worry about all the setup and everything I'd have had to do if that were my background. A lot of people going to Fabric right now are not data
48:00 engineers. And Mike, this actually really changed my training. When I started working on Fabric training, it was like, okay, what are the personas? You're a data engineer coming to Fabric; you're a data scientist coming to Fabric. And no — that's not it at all, at least right now. Again, I'm speaking on January 15th of 2025, or whatever the date is — a year into GA. The majority of people I'm finding coming into Microsoft Fabric are coming from a business intelligence and
48:30 business background, more than they're coming from a data engineering background. I'd agree with this, and I think it's a very important distinction. This is what I've been saying from day one when Fabric came out — and Tommy, we've argued about this a little bit — I don't agree with the personas Microsoft has put out here. Yes, there's a data engineer, but to Sam's point here, Fabric is light if you're coming at it from a data engineer's perspective. You're not going to get all the bells and whistles you've traditionally had with Microsoft
49:01 products, or other products for that matter. It's just not there yet. They're continually adding new features, and the gap is continually closing, but there are certain things in Azure Data Factory you can't do in a pipeline yet. However, what's making this compelling for me is that there are new features showing up that are Fabric-specific. I can't do Direct Lake without Fabric, and Direct Lake is amazing. They're doing things inside the Fabric ecosystem that are making it incredibly important. I was having a conversation with someone, and they asked me, "what would
49:33 make Power BI better for you?" My point-blank answer was: I'm doing so much data engineering work over in Databricks, I need the lakehouse to be right next to my semantic models. That's what I said — and then two years later Fabric shows up. I'm like, wow, someone was listening.
49:52 So Microsoft looks at their landscape. They have the privilege of being able to see how many people are building on Databricks, where the data is going, how many blob storage accounts are showing up — they know where the market is moving. So they're probably seeing trends in this market way before we are, because we're on the output side: they have to build a solution, it has to actually work, and then we start seeing the trend. Microsoft sees the trend way, way earlier than we do. So what's driving Fabric, or
50:23 even cloud compute, for that matter? There's a proliferation of data centers in the US; the cost to run compute in the cloud keeps dropping. It's becoming cheaper for me to run it in someone else's environment at scale than on my own on-prem server house, and I don't have to pay for people to run infrastructure. I can remove that cost and move it toward the analysts, using their brains to manipulate the data. That's the trend we're going toward. So I agree
50:54 with you 100%, Tommy, on the people coming to Fabric — and more of them: if you think about it, for every one data engineer that comes to Fabric, you're going to have something like ten analysts coming to Fabric. Just think purely about the size and volume of the market. To go through data engineering, you've got to go through specific classwork — maybe an information technology major — things specific to that role. That's one major by itself. Then look at the spectrum of people
51:24 who live in the business: it's all over the place. Anyone can go work in the business; there are a lot of different degrees, a lot of different technology backgrounds. A good example is Seth, who was on the podcast previously: he was an artist, found "I like SQL, I like data things," and moved toward that route. So while he wasn't a business-centric person, his degree didn't need to be information technology for him to work inside Fabric — and now
51:54 he's a director, which is amazing. But I think, with the size of that market audience, you're going to get more analysts there, and Microsoft is continually building the low-code, button-driven experiences around doing data engineering. And to your point, Mike: when you find a Power BI college degree, call me — let me know when that exists, because it doesn't. And let's not forget — this will be my closing thought — Microsoft always goes from the simple to the complex. Power BI was not conceptualized or created because of
52:24 Tableau and all the advanced analytical tools; it's because someone looked at what was happening in Excel with Power Query and Power Pivot and said, "I wonder if we could do something with this." They were Excel tools. Power BI is literally the son or daughter of Power Query and Power Pivot, which existed in Excel for years before Power BI. That's why we have Power BI. It did not come from "we should do something really advanced; we should create our own BI tool."
52:55 Yeah, you're right. Power BI was an evolution of Excel — the world's largest data engineering tool, right? It's already got hundreds of millions of users; it's a massive program. Make it as easy as possible to use — again, the automatic transmission: how can we make the transition as easy as possible? And to me, the linchpin of a lot of this was the Analysis Services engine. The Analysis Services engine changed how everyone thought about doing data and analytics. That engine alone — columnar storage, in-memory compute — that's the technology that
53:25 shifted Microsoft onto this path, because that's the Analysis Services that was built into Excel; that's where pivot tables came from. Amazing. Now here we are, multiple years later, with this fully interactive dashboard. Tableau was around, Tableau was doing it, but this was like a brand-new world, and I have to give a lot of credit to Power BI, because it literally changed my career. I've gone from mechanical engineering into full consulting
53:55 mode around Fabric and Power BI. I love this stuff; this is great stuff, and I think there's a large need for it. I'm fired up, man. Because — my final closing thought — Power BI allowed people who had "no business" being in tabular models to work in tabular models in organizations, with the SQL Analysis Services engine. And Fabric is just that evolution: people who had "no business" being in lakehouses and warehouses are now able to do so.
54:25 And I'm feeling more of that. When Power BI first came out nine years ago, in 2015, we were so excited about this new technology. Our initial blush was: wow, this is so easy to get started. It was so easy to download the desktop for free and begin. I could go from some random files I had to a report I could click on, with multiple visuals interacting, within a little bit
54:55 of time. That's what started our intrigue with Power BI, and they're trying to leverage that same level of intrigue with Fabric. I think the same thing is happening. We're in first gear with Fabric at this point; we're early. So what does Fabric look like five years from now, ten years from now, when Power BI has been out for 20 years and Fabric has been out for another ten? Microsoft is taking a lot of lessons learned from all their other data products and bringing them directly
55:26 to Fabric. I disagree with these takes. I think these two individuals are very critical of Fabric, and I think they're missing the mark. They're not seeing that we're just starting out. This is going to become part of our new workflow and process, and I honestly think Fabric will be the go-to solution. It incorporates a lot of other tools into one big solution; I don't have to buy a
55:56 bunch of stuff. So I think the ease of being able to just turn things on is making this a commodity for people. Microsoft is going to keep reducing the complexity, the issues, the challenges, because they want you to deliver value very quickly from your data, wherever it may be coming from. So I really like this topic, and it's definitely something to evaluate and keep thinking critically about: does Fabric fit for you? Is this a skill set you want to get into, or something your organization needs?
56:27 And so my final thought here is basically this: you're going to continue to see more and more analysts upskill into data engineering; that will continue to be the common trend. As the Fabric tooling becomes richer and more robust, we will eventually start seeing more data engineers choosing Fabric over other third-party tools to start their work. But right now I think there's still some resistance; there aren't quite enough features yet to make that experience incredibly smooth. So I think we're getting there; it's going to
56:58 get better. Regardless, I love that I'm seeing monthly blogs with all the new features coming out for Fabric. If you want to know what Microsoft is interested in, look at where they're putting their money: look at what they're building and which new features are shipping. If there hasn't been a new feature in datamarts in a year, don't use datamarts, right? It's probably not going to last very long. But if you see them building a lot of features around, say, Azure Data Factory or pipelines, wherever those new
57:28 features are coming from, that's where I want to be spending my time, because Microsoft is investing in it. That means they're listening to feedback, changing it, and trying to make it better and better over time. So that's where I'm going to spend my time: I'll keep myself aligned to the Microsoft blogs and understand what they're building, and as I continue to see developments, that tells me where Microsoft is putting its money and development time. Anyway, we really appreciate you listening to this podcast. This was a hard topic to work through, but I think it was good to unpack
57:58 it a little bit. Thank you for listening. I hope you're exploring more about Fabric as well, and we'd love to hear your feedback. Give us some comments below and let us know what you think about Fabric in your environment. How are you finding easy wins? Are you seeing more business analysts moving toward Fabric, and not as many data engineers excited about it? Over the last year, has that transition shifted? Is it changing for you? That being said, we'd really appreciate it if you'd let someone else know you enjoyed this conversation and the podcast.
58:29 Tommy, where else can people find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. And share with a friend, since we do this for free. Do you have a question, idea, or topic you want us to talk about, just like today? Head over to powerbi.tips/empodcast.
58:59 ...media channels. Thank you, Greg, and thank you, Sam, for the questions and your thoughts on this topic. I don't think I agree with you, but that's okay; we're allowed to disagree. It was Shami who sent in the question, so we want to thank him for it. Shami, thank you for the question, and Greg and Sam, thank you as well for your opinions; they were good thoughts, front and center in this article. I don't think I agree with them, but that's okay. We're allowed to disagree. Cheers, thank you all, and we'll see you next time. [Music]
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
