Using Only Bronze? – Ep. 391
Mike and Tommy tackle a common Fabric design question: can you ship analytics by building only a Bronze layer, or do you really need Silver and Gold? They break down what you gain (and lose) when you skip refinement layers, and share practical rules of thumb for keeping models trustworthy, performant, and maintainable.
News & Announcements
The episode opens with a FabCon 2025 heads-up: when registering, select Microsoft Partner for “How did you hear about this conference?”, enter PowerBI.tips as the partner name, and use discount code PARTNER200 to save $200.
Power BI Theme Generator (Tips+) — A quick way to generate and manage Power BI theme JSON for consistent report branding. If you’re trying to standardize visuals across many reports (or multiple client workstreams), this is an easy win that reduces rework and helps teams ship faster with fewer design debates.
The Podcast page on PowerBI.tips — The home for the show, including ways to listen and follow along live. If you want to browse recent episodes or share the podcast with your team, this is the best starting point.
Follow Mike Carlo on LinkedIn — Mike shares episode links, Fabric/Power BI thoughts, and practical lessons learned from consulting work. A good way to catch updates between episodes.
Follow Tommy Puglia on LinkedIn — Tommy posts perspective from the “how do teams actually run this in production?” angle. If you like the governance and operational side of the podcast, his feed is a solid complement to the show.
Main Discussion: Using Only Bronze in Fabric
The core question is simple, but the implications aren’t: Is it ever “correct” to build a Fabric solution where you land data in Bronze and then model/report directly from it?
Mike and Tommy’s framing is that the right answer depends less on what the medallion diagram says, and more on what you’re optimizing for: speed to value, trust, performance, reusability, and the long-term cost of change.
What “Bronze-only” really means
In practice, “Bronze-only” usually means one (or more) of these patterns:
- You ingest raw-ish data into a lakehouse (or warehouse staging), then let downstream models do the cleanup.
- You apply light transformation but keep tables close to source shape (wide, transactional, lots of columns).
- You accept some ambiguity (late-arriving dimensions, inconsistent keys, evolving schemas) because it’s “good enough” for the current use case.
The benefit is speed: fewer pipelines, fewer handoffs, fewer places to debug.
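To make the trade-off concrete, here is a minimal sketch in plain Python standing in for lakehouse tables. The record shapes, field names, and the `report_total_by_customer` helper are all invented for illustration, not a real Fabric API. Raw records land close to source shape, and every downstream consumer repeats its own cleanup:

```python
# Hypothetical Bronze-only pattern: land records as-is, then let each
# downstream model repeat its own cleanup. All names here are invented
# for illustration.

raw_orders = [  # source shape: wide, inconsistent keys, raw strings
    {"OrdID": "A-001", "cust": " ACME ",  "amt": "120.50", "ts": "2025-01-16"},
    {"OrdID": "a-001", "cust": "ACME",    "amt": "120.50", "ts": "2025-01-16"},  # dup key, different casing
    {"OrdID": "A-002", "cust": "Contoso", "amt": "80",     "ts": None},          # late-arriving date
]

def report_total_by_customer(rows):
    """Each report re-implements trimming, casing, typing, and dedup."""
    seen, totals = set(), {}
    for r in rows:
        key = r["OrdID"].upper()      # cleanup logic duplicated in every consumer
        if key in seen:
            continue
        seen.add(key)
        cust = r["cust"].strip()
        totals[cust] = totals.get(cust, 0.0) + float(r["amt"])
    return totals

print(report_total_by_customer(raw_orders))  # {'ACME': 120.5, 'Contoso': 80.0}
```

The speed is real: no pipeline between ingestion and reporting. The cost is that the trimming, casing, and dedup rules live inside each consumer, which is exactly the duplication Silver layers exist to remove.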
When Bronze-only can be a smart move
They outline scenarios where Bronze-only is defensible:
- Early exploration / proof of concept — You’re validating a business question or data source, and you don’t yet know what should be standardized.
- Single-team, single-use semantic model — The same small group owns ingestion, modeling, and reporting (and can move fast together).
- Low-risk reporting — If the business impact of “we had to correct that number” is limited, you can accept more iteration.
The key is being honest: you’re trading structure for speed, and that trade has a shelf life.
Why Silver and Gold exist (and when you’ll miss them)
As soon as more people depend on the data, the costs of Bronze-only show up:
- Trust and consistency — Without a refined layer, every model (or analyst) can interpret the same source fields differently.
- Performance and scale — Raw transactional shapes often lead to heavier model refresh, more complex DAX, and slower queries.
- Change management — Source schema changes and business logic changes get entangled, so every change feels risky.
Silver/Gold layers are basically an agreement: “this is the standardized shape and meaning of the data.” They’re not about purity — they’re about repeatability.
A practical rule of thumb
A recurring theme is: if you can clearly name and describe a reusable business entity, it’s probably Silver (or Gold).
Once you’re defining conformed dimensions (Customer, Product, Date), stable fact tables, or standardized calculations that multiple reports will share, putting that logic in a durable layer pays off.
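As a hedged sketch of what that promotion can look like, again in plain Python with invented table and column names (in Fabric this would typically be a notebook or pipeline writing a Delta table): the cleanup each report used to repeat becomes one named, durable transformation.

```python
# Illustrative Silver-layer promotion: standardize keys, types, and duplicates
# once, so every downstream model agrees on shape and meaning. All names are
# invented for the example, not a real Fabric API.

bronze_orders = [
    {"OrdID": "A-001", "cust": " ACME ",  "amt": "120.50", "ts": "2025-01-16"},
    {"OrdID": "a-001", "cust": "ACME",    "amt": "120.50", "ts": "2025-01-16"},  # duplicate key, different casing
    {"OrdID": "A-002", "cust": "Contoso", "amt": "80",     "ts": None},          # late-arriving date
]

def to_silver_orders(rows):
    """One agreed-upon cleanup: conformed keys, trimmed names, typed amounts."""
    silver, seen = [], set()
    for r in rows:
        order_id = r["OrdID"].upper()     # single dedup rule, applied once
        if order_id in seen:
            continue
        seen.add(order_id)
        silver.append({
            "order_id": order_id,
            "customer_name": r["cust"].strip(),  # conformed Customer attribute
            "amount": float(r["amt"]),           # typed here, not per report
            "order_date": r["ts"],               # late-arriving dates stay visible
        })
    return silver

silver_orders = to_silver_orders(bronze_orders)
```

Gold tables and semantic models then read `silver_orders` instead of re-deriving the rules, which is what makes definitions repeatable across reports.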
Looking Forward
The takeaway isn’t that you must build a perfect three-layer architecture on day one — it’s that you should be deliberate about when you’re optimizing for speed versus stability. Start simple when you need to, but add refinement layers as soon as you see duplicated logic, conflicting definitions, or rising “data trust” debt.
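That "be deliberate" advice can be written down as a rough checklist. The function and thresholds below are our invention, not something stated in the episode; treat it as a hedged heuristic rather than a rule:

```python
# Illustrative heuristic (thresholds invented): treat a dataset as ready for
# a Silver/Gold layer once reuse and trust signals stack up.

def should_promote(consumers: int, duplicated_cleanup_steps: int,
                   conflicting_definitions: bool) -> bool:
    """Rule of thumb: promote when logic is shared, repeated, or contested."""
    return consumers > 1 and (duplicated_cleanup_steps >= 2 or conflicting_definitions)

# A proof-of-concept with one report and ad hoc cleanup can stay Bronze-only...
assert should_promote(consumers=1, duplicated_cleanup_steps=3, conflicting_definitions=False) is False
# ...but shared entities with repeated cleanup belong in a durable layer.
assert should_promote(consumers=4, duplicated_cleanup_steps=2, conflicting_definitions=False) is True
```

The exact numbers matter less than watching the same signals the episode names: duplicated logic, conflicting definitions, and rising data-trust debt.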
Episode Transcript
0:31 Good morning and welcome back to the Explicit Measures podcast with Tommy and Mike. Good morning everyone, welcome back to the podcast. Good morning Mike. Morning, peoples. Well, I'm going to apologize right away at the very beginning of the episode: my throat feels a little sore, it's a little scratchy. I've been dealing with a bit of a cold, that wintertime cold coming through, so I'm dealing with the remainders of that. That's all right, I have my routine, I've got the things I need to get myself healthy.
1:01 Everyone has their routine when they get sick, so we'll go through that in a second. But really quick, let's jump into our main topic for today. Today is a mailbag question: can I land all of my data using the Bronze layer and serve data out from there directly to my customer? So, is it reasonable to only have the Bronze layer of data when you load data into Fabric or into a lakehouse? I think this is going to be an interesting topic; I'm looking forward to unpacking what this means, and we'll get to that in a second. But anyways, a couple of other announcements.
1:34 Tommy, are you going to the Fabric Conference 2025? That is currently the plan. I love being part of the community, so nothing's final yet, but I've heard so many good things about it from people who have gone. You went last year? Yep. So my plan is to go; it sounds awesome. I think there were roughly around 4,000 people at the conference last year in Las Vegas. We were staying in the MGM Grand hotel, I think; I stayed in the hotel at the conference center,
2:05 which was huge, like a multi-level conference. I'd highly recommend it. It was a good time, I had a lot of fun, met a lot of new people, and there were a lot of existing friends I met up with and hung out with there as well. Overall I would highly recommend it. There were tons of sessions, lots of things even I wanted to go to and make sure I jumped in and learned a couple of things, so it was definitely an enjoyable experience. I really liked it and would recommend it. Also, in the description below we have a discount for you, so if you're an avid listener of the podcast, don't say we never gave you
2:35 anything. We are a Microsoft partner, so PowerBI.tips is the partner name, and when you check out you can use PARTNER200 to get a $200 discount on your tickets to the Fabric Conference; you just use PowerBI.tips as the name of the partner where you got the tickets from. You're welcome to have that. It's down in the description if you want to check it out, and it's on all of our videos across YouTube and social media platforms in case you forget where
3:05 to go get it. Give that to your colleagues and your friends and get them a little bit of a discount on the conference. All right, Tommy, what do you do when you get sick? What's your get-better routine? Everyone's got a little bit of a routine when they get sick. So obviously, first it depends on the sickness: is it the cold, is it the sore throat, or is it just the "I'm down" kind of thing? For the "I'm down" one, I actually actively try to sweat it off, so I will go to the downstairs bed, I will put on
3:37 ...yeah, oh yeah, I need my... I'm not going upstairs; I'm not going to infect everyone else. I like being in the dungeon, like I am alone in a cocoon, and I just put on sweaters and try to work it out, just try to work it out. One thing I will say: I don't know if it was when we had that whole thing that happened five years ago, I don't remember the name of it, but since then my sweat's very different, unfortunately. It just smells; it's weird, it's just upsetting.
4:08 But when I feel the sweat coming, I'm like, yeah baby, you're going to get better. That's weird; I've never heard of anyone doing that. My mom had always given me ramen noodle soup, so ramen is like my go-to if I'm feeling sick or coming out of sickness; it's like, okay, I need to eat some ramen to get better. And then ginger ale has been my favorite, so I usually drink a lot of ginger ale when I get sick; I crave ginger ale. I have a lot of ice in a glass, that's awesome,
4:38 and a filled-up glass of ginger ale, and that seems to do pretty well for me. So, for colds and sore throats, and I want to see if this goes across states here, from an Italian point of view: for me, when I was growing up, we had what I'll call immigrant remedies, even though I'm third generation. If you had a cold, we'd get a pot of boiling water, my mom would put Vaseline in it and just put a towel over your head. You're like, I can't breathe, I can't breathe, and it's like, stay. Oh yeah, to me that was
5:08 normal, and then you go to junior high and you say, you do that too? No one does that. No one does that. So we had some weird things like that, but that was the worst, and I don't know if it helped. No, I don't think I could smell anything anymore. I've never had anything like that; we never did home remedies like that, it was more just run to the doctor. I have a lot of allergy-type stuff, so I remember growing up a lot of my sickness was related to allergy things, nose
5:40 congestion, that stuff. So you give me a beach towel and hot water and we're golden, good to go. Oh man. Well, we're hoping everyone stays healthy, and if you're going through the seasonal under-the-weather type feelings, we hope you all feel better soon. For those of you who aren't sick, stay healthy, keep getting your vitamin C, and go outside and see the sun when you can; hopefully that'll keep you healthier longer. Yeah, and if it's not Vaseline, it's, what's the
6:11 one that you put on your chest? Oh, that's Vicks VapoRub. That's what I did, Vicks in the water. So if you're listening: not endorsed by the podcast, the Vicks-in-hot-water thing. We just want to put that out there; that is not one of our consulting recommendations. What do they say, "this is not financial advice"? Right, this is not medical advice. This is just things our moms did to us when we didn't know what was going on when we were younger, not one of our best practices. That's amazing. Awesome. All right, well, jumping into news items, any kind
6:41 well jumping into news items any kind well jumping into news items any news topics things that are out there of news topics things that are out there Tommy that you want to discuss or talk about honestly we did our big draft and I think that’s something we’ll continue to do is as Fabric and powerbi updates come back we’re out we’re going to do that draft format but besides that like we’re we’re getting back into it I think with all the news stuff I do want to throw a big comment here from ruy Romano yesterday he on January 16th he dropped out a really really interesting blog a deep dive into timle view for powerb desktop drewy also came on the channel
7:12 drewy also came on the channel last week I think it was Wednesday and he and I went in a deep dive all the way through the Tindle view all the the information the details there it was quite elaborate I’ll definitely High highly recommend the YouTube video about this topic this will change how you build things in desktop 100% every single user of this now if you’re a brand new user you’re probably not going to want to dig in a lot but it’s still useful in small doses I think the timle view could make it really easy for editing multiple measures checking
7:42 for editing multiple measures checking that things are built correctly so I do think it’s going to be very useful for people highly recommend it go check out his blog the blog post here is actually very informative and one of the things that really does which is I think very encouraging here is he’s trying to give you like practical examples of where you can use timle view so I really recommend it tmdl view is coming to desktop it’s already out now when you download your latest version of desktop it will be there it will change how you do your work daily to day so and really
8:12 do your work daily to day so and really recommend that this is that’s a good news item because I’m seeing a lot of people have a lot of Buzz about Marco Russo’s been talking about it too so to talk about adoption you want people like you you want people like Marco Russo and a ruie and other people that are very prominent to talk about it because that introduces the concept right rather than just and I don’t I think Dax query maybe it’s gotten love but I don’t think it’s G to get as much love as symol VI I think you’re right on that daxie I think is extremely useful but timle view
8:44 is extremely useful but timle view does things that you traditionally couldn’t do in the semantic model like you couldn’t make or edit a perspective you couldn’t change various properties inside the semantic model you couldn’t add annotations like it’s all these extra things that the semantic model provides you that you just couldn’t easily modify well now this is great because anytime they want to bring a brand new feature to the semantic model it’ll be immediately exposed as part of timle view so any new feature they bring in will 100% be supported by
9:15 they bring in will 100% be supported by power desktop I think honestly Microsoft got some feedback think about it right you’ve got this semantic model you’re you’re editing you’re building it in desktop and imagine having a tool that the creator of the semantic model gives you that you can’t do everything in you need third party tools to do all the extra manipulation pieces and so Microsoft I think got some really strong feedback that they need to be able to edit the entire semantic model in one program has to be Power by desktop so this is why I’ve been saying for a number of years now and Tommy I think you’ll you maybe will more agree with me
9:46 recently: Desktop is becoming more of that pro-developer tool. Now that we have two more windows that are purely code-based inside Desktop, it's going to be the pro-dev tool, and I think
9:59 you're going to want to use the easy-to-use experience, which is going to be web-based, more and more as Desktop gets more and more pro-developer type tooling. I'm going to push back a little. It's not a full pushback; it's a one-handed, hand-on-the-shoulder "hey," not a full football pushback, and these are the two reasons why. One, those buttons are optional and not really prominent, especially if you don't enable them, right? So I can still go about my
10:31 business every day. Currently they're in preview, so they're not automatically on. Yes, and obviously they will be. But if I were a hermit or a monk in Germany and I just didn't have the internet, but I did have Power BI... just run with it; I don't know how it relates at all. Okay, Tommy, keep going. I see: you're someone who doesn't read up on the blog, doesn't need all the latest, greatest thing. My company pushed Power BI Desktop to my laptop, right? Look, I don't really know what's going on; it's just here, and I'm going to start using it. Like, you're not digging in and actively
11:01 researching it. I would challenge that even if you're a weekly developer or weekly author in Power BI Desktop, you don't have an incentive to use TMDL. Probably not, yeah, probably not. The more you model semantic models, the more you do data modeling, the more you're going to want to use it. And again, Rui clearly states in the blog that this is a pro-developer tool for semantic models in Desktop, so
11:31 I highly recommend it; super good video. I'll put the link for the video down below so everyone can check it out and watch it. It's getting good views already, so I think people will really like engaging with this part of the product. Any other news or announcements, Tommy? Is there anything on the Fabric blog? At this point, I don't know when Fabric rolls out its blog posts. I'm not going to say it's not worth talking about, but it's a lot of what I'd call admin things. There's log management, maybe we get into that another day, service
12:01 principal support for Fabric Data Warehouse, and "speed up your SQL database with a performance dashboard." So I think it's a lot of performance-enhancement types of items, not so much UI or features at this point. Surge protection is another one, which makes me think of my outlet, like, how much power are we getting here? But those are really, I would say, the biggest items, and then SQL database and workspace roles with item permissions. So I think we're going to see a lot of SQL
12:33 database features and enhancements coming up. Yeah, SQL database is out in preview right now, and I think they just got it out the door, like, hey, here it is, you can use it. The next layer of features is going to be more about security and authorization, getting the administration portion of it corrected and more in line with... I mean, Fabric has its own way of doing things, and SQL Server had its own way of managing users and content as well, so Microsoft has to figure out how you bring a SQL Server into Fabric and still give it the same
13:03 level of control that you had previously in SQL, but with all the additional richness that comes with Fabric. What is that going to look like? I think they've got to figure out that story a bit more, and they're listening: here it is, SQL is here; community, what do you want? So I think it's very important that, if you want various aspects of SQL Server in Fabric, you be vocal. Go out and post on the ideas.powerbi.com site. It's extremely important that you make comments about what you want, because the product team is actively
13:33 listening for those things and will change stuff according to the community. We're about at the 90-day mark since databases were announced, so let me ask you: how fast are you running with SQL databases? Are we at a light jog, are we at a turkey-trot run, are we sprinting at this point? Where are you right now with SQL database in terms of your actual day-to-day use of it? Yeah, I do a lot of things with the lakehouse. A lot of my reporting is not operational reporting; most of my reporting is slower than that, like once- or twice-a-day reporting
14:04 pieces, and most of our customers don't need a full SQL Server to do these things. I am getting a lot more questions from companies that are saying, hey, we're doing things like budgeting, or we need things that are going to be a bit more fluid and need real-time updates as a team of people works on planning things for the future. That's where I think the SQL Server makes a lot of sense right now: it's going to give you that highly available system that allows users to quickly enter or add records into tables
14:36 and then immediately show those inside Power BI reports with DirectQuery. I think that's going to be a very big win. We're seeing people explore it right now; I don't have any customers actively building on top of the SQL database for production or a project yet, but I think in the near term individuals are going to want to start looking at it. I have been doing some additional testing on it, so I have a workspace that I do a lot of testing with, and I just turned on the SQL Server. One of the observations I was very pleased to see, and this is going to be very technical in nature, but ask questions if it doesn't make
15:08 sense: with some features in Fabric, you turn them on and every day they cost you compute units. One of those is Real-Time Analytics: if you turn on an event hub or an event stream, it's on all the time, and every day you see a little bit of compute units being dripped away, because the machine is turned on, right? It has to be highly available; it has to be listening to the data that's coming in, so there's always some bit of compute being consumed. And I
15:38 thought, hmm, I wonder how the SQL Server will work. Is it the same experience? If I turn on the SQL Server, is it always on, continuing to listen or be available, so that I see it constantly being used in my compute usage? So I went to my workspace, I turned it on, I played with some SQL, I made some tables, looked around a little bit, saw the CUs come through the analytics reporting, and then I left it alone. I didn't touch it for a couple of days, and I was pleasantly surprised that there wasn't that slow drain of compute usage when the SQL
16:10 Server wasn't actively being accessed or data wasn't being read from it. That's big, that's really big, because I was really worried: if I turn on a SQL Server, is it going to consume like a thousand CUs a day just because it's on, just because I clicked the button, the same thing the event stream does? No, it doesn't seem like it's doing that; it only seems to spin up while you're using it. Now, what I don't know is... I imagine it's something more like your
16:41 Spark engine, right? When you turn on Spark compute, it turns on, stays on for a period of time, and then shuts off, so it stops charging you; the machines spool themselves down and pause, basically, right? Yeah. So I was thinking, is the SQL Server the same way? Something I haven't done yet is set up a daily job on the SQL Server just to wake it up, see what it's doing, and then let it run for a couple of days. I'd be curious to see what that does impact-
17:12 wise for the compute usage: whether it ramps up really high or really low, and whether there's a period of time that the SQL engine keeps running. Is it accruing compute units for just the duration of running my queries and then immediately shutting off, or does it need like 8 or 12 or 24 hours before it actually says, I'm not being used, I can pause myself, shut down the machines, and stop charging compute units? Does that make sense, what I'm saying there? No, this is actually... I'm pretty sure that sounds like I'm
17:42 encouraged but also slightly concerned by what you said, because that sounds to me exactly like the basic Azure SQL Database, if you were to get, I think it's like a B1, in Azure. And to your point, the serverless version of that, right? Yeah, exactly right. And in Azure they have two directions you can go with a database:
18:12 the basic one has basically what you said; it sounds like you're not paying for it just to sit there, you pay when it's actually getting called or transacted. But the problem is, and I don't know the features available with that, the other version of the SQL database in Azure has a lot more features and a lot more scalability. I'm pretty sure I read somewhere in the documentation that the equivalent Fabric database is like the basic or standard version of an Azure database, so you're limited on features, but it does the basics of what you need it to do. So that's fine,
18:43 but then it goes to the other questions we've always had: does that meet the needs of any type of database I need? A standard database, sure, but for more complex scenarios, how does that actually scale? So it's nice to know that if I store my data, I'm not paying just for that; I'm really only paying when I'm calling it. If I do a refresh, a little cost, fine; that's what you're used to. Yes, I'm not paying for everything else. I've done some tests as well, not from the cost side but from the feature
19:13 side. Okay, what do you think about the features? What are you seeing from your side of things? To use that running analogy, I'm not doing track and field yet, but I'm doing light jogs in the morning; that's about where it is right now. So at least you've put on your shoes. A New Year's resolution, a mile on the treadmill. Exactly, exactly. Yeah, we'll see how long you last; there may not be many
19:44 more runs when we hit February. It's still not as seamless as I would like when dealing with other systems, because to me, if you're going to have a database, it's not like I'm doing a database just so I can do reporting in Power BI. That's silly. I would agree with you, I think it is silly, and again I
19:58 want to unpack your comment there, because I think it's a really important point. If the speed at which you need new data to come into your reporting solution is more than three or four times a day, it's probably not worth it for you to go after Spark and lakehouses; if you need it more often than that, you might want to start thinking about databases to actually be that transactional system where you're getting more real-time access to that data. I'm not sure if I'm
20:28 going to be putting billions of rows inside the SQL Server, but if you're doing something like a definition table (I've seen people do things like, hey, we want a lookup table for something, or we want configuration parameters stored somewhere), and people are comfortable using a SQL database to do that, I think it's great. Anyway, there's a scale and a position for this: where the data needs to be more real-time or more up-to-date immediately, I think that's a good use case for the SQL
20:59 Server. Sorry, you just got another topic on the board: Fabric, or Gen 1 dataflows, or SQL databases if I'm doing reference data? That's a good question. But to your point, if I'm creating a database in Fabric, the purpose of that would not be just reporting; it had better talk to Power Automate, and it had better talk to Power Apps or other application-based systems. Oh, I see, you're saying it can input and output data, right? It's not just reporting. That's the
21:29 point. So you could stand up the SQL Server, and the SQL endpoint exists, but you're saying it's a little bit too much friction to use Power Automate or Power Apps or couple the SQL database with any other system, right? And you've done a lot of this in the past, where you've built a lot of Power Apps on top of a SQL database. Exactly, and I think that to me is where I've always seen a database: it's obviously reporting, but it had dang well better talk to other systems, and
21:59 not just Microsoft systems, because it's a database; that's the power of it. And when you say talk, you're talking a two-way path, right? Read, write, integrate, and automate. So the reason I'm making that distinction is because I don't think I agree 100% with you on that one. I think of the Fabric platform as more of a consumption method, a landing zone for all the data, so I think of it as more of a read-only thing. Again, until Fabric SQL, you could do some read-
22:30 write things inside the Fabric world, but I think of it more as: I have data that exists in, to your point, many different sources. I have production servers that are running things; I need to get that data down to Fabric as quickly as possible so I can shape it, tune it, and join it together, right? If I think about the flows of data, other data systems are doing the ins and outs of data editing, the changes, the updates, but Fabric is the
23:00 system more in charge of collecting everything into one place. I think of it as the bottom of the funnel, maybe. I think SQL databases change part of that mentality: it's not the bottom of the funnel anymore; you have the ability to build an application inside Fabric. That'll be interesting; I'll be interested to see what people will do with that. I like that you and I differ there, because that point you differ on is the thing I'm most excited
23:31 about with Fabric: Power BI and the data platform are no longer the last stop on the train in terms of where your data is going, to your point, the bottom of the funnel. It's been elevated: you can get off the train, you can take a left; there are a lot more directions your data can go. Yes, I agree. Well, I think that's actually not a bad segue at all. We're going to go into our main topic today. Tommy, give us our main topic for today. First, again, thank you to everyone who's been submitting mailbag questions; we're trying to go through all of them, and we love the questions. If you want to know how we pick them, it's when me and
24:02 Mike both go, "oh yeah, that's a good one." Let's be real here, Tommy: if your name ends with Mike or begins with Mike, your topic will get picked. This is another Mike. Name ends with Carlo. Who are all these Mikes? All the Mikes must listen to the podcast. Or maybe no one wants to give their name. This may become a running joke: no one actually submits questions to the podcast, it's just me submitting them, or everyone's using my name to submit questions back to the podcast. I'm not going to say publicly that I'm biased towards

24:32 Italian names. No, no, this question comes from Giovanni. Gian Luigi. Yeah, I know. But honestly, we love the questions that come in, so let's jump right in. All right, we tee off this question from another Mike: "With Fabric, do you think it's reasonable to have the bronze layer as the only source of data? Think different from the typical data-warehousing replication of a source
25:03 system, but storing around 100 gigs of unstructured data in the bronze layer rather than on a local network drive. Can this be considered a more accessible form of blob storage? Hot storage, cold storage? Would this be cost-prohibitive or inappropriate as the only source of some of the data?" That's a great question, and there are a couple of questions packed in here. The first main question is: should bronze be the only source of data? I

25:34 definitely have some clear thoughts on that one. The next question is whether there's a size or scale of data where this stops making sense, so there's a scaling question here. Then they ask about hot versus cold storage, which I definitely want to unpack, because there are some limitations with Fabric storage inside the lakehouse: you only get hot storage, you don't get cold storage there. But you would have patterns for how to use hot and cold storage, and with Azure storage

26:04 accounts now, if you bought one today, there's actually more than just hot and cold. You have hot, cool, cold, and archive: four tiers of ways you can store data, depending on how you need to keep information. So yeah, let's unpack those. What do you think, Tom? Let's start with the first question. The first one is, to me, the most intriguing, and "intriguing" is not even a good enough word here: the idea of using
26:37 Fabric to only store my bronze data, and use it in a sense as just a repository of my data, is pretty powerful when you think about what a lot of companies go through trying to get hold of their data. I'll rephrase it a different way: if you could get all my data into Fabric, even if it's in bronze, even if it's just raw, but it's all in Fabric, that is a huge win for a ton of companies. A major win, even if

27:10 we still have to do a ton of stuff and we're not there yet. If I can push all my data from all these different sources, Excel, blobs, even other SQL databases, and it's all in a centralized place that I can locate, I guarantee you the majority of companies I work with would sign up for that today if they knew it was available. Yes, and I would call this a huge win if you were able to do this. So let me

27:40 just define what I interpret the bronze layer as, because I think that's why this question is coming up. They're talking about the medallion architecture: bronze, silver, and gold, the three layers. Bronze is defined as raw data. I make an API call, and the data that comes back is a straight JSON object; it's just a file that is JSON. Or if you use a pipeline or a Dataflow Gen2, you can go

28:11 to a SQL Server, hit a table, and return the data as a table. That table is then placed back in the lakehouse as a Delta table, a series of files written back into the lakehouse in a form that everything else in Fabric can read, already shaped into an easy-to-read table. So those are the two main shapes: unstructured or semi-structured data, and straight tables of information. The bronze layer is intended to be that raw layer.
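That "land it untouched" step can be sketched in plain Python (a hypothetical helper; in Fabric this would typically be a pipeline copy activity or a notebook writing into a lakehouse's Files area, and the paths and payload here are made up):

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def land_raw_json(payload: dict, bronze_root: str, source_name: str) -> Path:
    """Write an API payload into the bronze area exactly as received.

    No renames, no casts, no filtering: bronze stays a faithful copy
    of the source so it can be audited or replayed later.
    """
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S%f")
    folder = Path(bronze_root) / source_name
    folder.mkdir(parents=True, exist_ok=True)
    target = folder / f"{source_name}_{stamp}.json"
    target.write_text(json.dumps(payload))  # raw shape, untouched
    return target

# Example: pretend this came back from an API call, quirks and all
raw = {"orderId": "1001", "CustName": "  Contoso ", "amt": "49.90"}
landed = land_raw_json(raw, "bronze_files", "orders_api")
```

Keeping bronze byte-faithful like this is what lets you later compare it one-to-one against the source system.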
28:41 One mistake I find, where a lot of companies overthink a little bit, is this. Let me go back to the SQL Server analogy. When you build a SQL Server and you're loading data in, you typically have two tables: a staging table, where you bring in the information as-is and stage it, and then from the staging table you do some joins back to your main table. Okay, I've staged the data: do I insert records, do I add new records to my main table, do I delete records? Maybe there's delete handling there too. So the staging table is where you

29:11 land the data before you actually make the final modification to the final table. In the landscape of medallion architectures, that staging step is more like silver. What I would say is that with the bronze layer, people sometimes make the mistake of over-cleaning the data, over-transforming it before it gets to bronze. Instead of just connecting to the data source and loading it in with no changes,

29:41 they'll build a dataflow and say, well, I'm going to rename this column, I'm going to delete these things. The tool makes it easy, right? There's this urge to modify the information. Also think about a SQL

29:56 connection: in that SQL connection you could write a SQL statement to do whatever you want. You can make a view on the SQL Server, do transformations there, join multiple tables together, build a more complex view, and then just load the final view into your bronze layer. Sure, but these all have trade-offs. The idea, for me, is that bronze means I want the data as close to the raw source as possible. That way I can literally run a query against the source system, run a query against bronze, and know the number of records is the

30:27 same and the total of sales is the same. Me personally, I want the bronze layer to be as untouched as possible, as close to the source system as it physically can be. Then, to answer the question directly: is it reasonable to have only bronze? I don't think so. If you're doing transformations before you get to bronze, as part of the loading process, then maybe you could, but I don't really love doing that.
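That source-versus-bronze sanity check can be sketched in a few lines (illustrative only; the `reconcile` helper and its field names are invented, and in practice the two inputs would come from a query against the source system and a query against the bronze table):

```python
def reconcile(source_rows, bronze_rows, amount_key="sales"):
    """Compare row counts and a control total between source and bronze.

    Because bronze is untouched, both checks should match exactly;
    any drift points at a broken or partial load.
    """
    return {
        "row_count_match": len(source_rows) == len(bronze_rows),
        "total_match": (
            sum(r[amount_key] for r in source_rows)
            == sum(r[amount_key] for r in bronze_rows)
        ),
    }

# Stand-ins for the two query results
source = [{"id": 1, "sales": 100.0}, {"id": 2, "sales": 250.5}]
bronze = [{"id": 1, "sales": 100.0}, {"id": 2, "sales": 250.5}]
checks = reconcile(source, bronze)
```

The moment you rename columns or filter rows on the way into bronze, checks this simple stop being possible.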
30:58 Actually, I think the question is: do you think it's reasonable to have the bronze data as the only source of data, the only layer, with nothing else done to it? You may be interpreting the question a little differently, Tommy, but I'm interpreting it as: there are no silver or gold layers, and I don't need any additional transformations to get it into something else. Oh, I see, that's how you're taking it. I think you're thinking about it totally separately. I'm thinking about it as "I don't need any other transformations," and I would disagree with that. I would say you need some

31:28 layer of transformations. Now, it may not be bronze, silver, gold, but it will probably be bronze plus a final form of the data. So it may be bronze and gold; you may not have a silver layer at all, because you may be able to do all your transformations right from the bronze raw data. That's why I brought up the staging-versus-final-table analogy earlier. Think of your bronze layer as the staging layer: I bring the data in, I stage it, and then I go to a final version of that table, and I do that in gold. The

31:59 reason I bring this up: imagine I have a semantic model that's refreshing, or multiple semantic models refreshing. If I put the data transformation logic into the semantic model, then every time I refresh those models I'm reapplying the same business logic over and over again, and it's costing me compute. What I would prefer is to compute the data once, save it, and then only read the saved data, because that's always going to be faster. Save the final table, as

32:30 opposed to reading the raw table, doing a bunch of compute, and then doing something else with it every time. This is why I love the podcast: I need these other ways of interpreting things. I would have taken it the way you're saying. If the wording had been "we just went straight to gold," that would have matched my view. Yeah, I'm assuming the comment is bronze only: I'm only going to land the data, and that's it. I think what you were saying, Tommy, was: all my reporting comes from something that's in the lakehouse.
33:00 Yeah, basically: I don't care if I'm even doing anything with it, should I just store the bronze layer and go from there? And it's intriguing you say that. I want to touch on the way you took it, because you could take it both ways, to your point. I would agree with you if that was the way the question was asked: the point of Fabric is to be able to go through those steps, and if I'm not doing anything else, then we're missing the point here. So I would completely agree

33:31 with that. And let me give you one other point for my reasoning. The second part of the question: Mike says, let's say I'm storing around 100 gigabytes of data from whatever system. Given how much data is coming into Fabric, it sounds like it's something structured, probably a SQL database, probably structured tables of data, just based on the size. But humor me here for a minute: if I

34:02 brought 100 GB to Fabric and stored it for a whole year, every gig you store for an entire year is roughly 21 cents. That means 100 gigs would cost you about $21 to store all the data for a year. The reason I bring this up is that there is no amount of compute you can buy for an entire year, to do raw transformations on data, that comes close to that. The compute will always be more expensive than purely storing the information. And because storage is so

34:33 incredibly cheap, even if I had to double that, say I take 100 gigabytes into bronze and make another 100 gigabytes in silver, you're talking roughly $40 a year. I guarantee you that data is adding more than $40 of value; I would probably say it's adding ten times that. So take the $21, and say you did a terabyte for a year: multiply by another zero and you're at about $210 for the year. That's nothing.
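Mike's back-of-the-envelope math looks like this in code (the 21-cents-per-GB-per-year rate is the rough figure quoted on the show, not an official price; check current OneLake pricing for your region before relying on it):

```python
RATE_PER_GB_YEAR = 0.21  # rough per-GB-per-year figure quoted on the show

def yearly_storage_cost(gb: float, layers: int = 1) -> float:
    """Yearly storage cost if each medallion layer keeps a full copy."""
    return round(gb * layers * RATE_PER_GB_YEAR, 2)

bronze_only = yearly_storage_cost(100)            # -> 21.0
bronze_plus_silver = yearly_storage_cost(100, 2)  # doubled copy -> 42.0
one_terabyte = yearly_storage_cost(1000)          # "add another zero" -> 210.0
```

The point of the arithmetic is the asymmetry: storage scales linearly at pennies per gigabyte, while repeated transformation burns capacity units on every refresh.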
35:06 You spend more money than that on Fabric even on the low end. The fact that it's so cheap to store things means I want you to feel very little resistance to just bringing everything to Fabric. Get it in there, and once you have it there, then we think about the most efficient way to compute it and bring it out into reporting. Because there has never been a project I've been on where the source system had data in the exact right format I needed for reporting. There's always some level of transformation happening, and

35:38 even if the data is clean, that doesn't mean it's ready for the reporting and for the business. Yeah, so there were some context clues here that took me in this direction, Mike: the idea that our number one priority, regardless of whether we do anything with it afterward, is to take the data, store it in Fabric, and then allow people to connect to it. That's the number one goal. I don't care about silver, I don't care about gold, I just want the unstructured data in my

36:08 system. He did use the word "unstructured," and he's talking about Azure blobs and a network drive. To me these are Excel files, and this is "give me your tired, your poor," so to speak: give me your data, I don't care what format it's in, rather than trying to do anything with it first. And I like this a lot, Mike. Here's why: that would be a great first project for Fabric.
36:38 If you recall, on an early podcast we talked about mental assumptions and misconceptions, and one of those was the paradox of choice, the crisis of too many options. I think a lot of people and organizations are going to fall into a trap with Fabric of "I have to do everything." If I get data in, I have to do bronze, silver, gold, I have to do all the transformations, I have to do a database, I

37:08 have to do Direct Lake for all my projects, otherwise it's not worth it. I think that's a huge misconception that misses the point of Fabric. Can it do all of that? Yes, just like you can go to a sushi buffet and eat everything. You could do it, but is it the best option? Probably not. You're probably going to get better sushi ordering at the sushi bar. And I'm not going to

37:39 deny that I may have ordered way too much sushi. Two orders of wontons? How many? Five Philadelphia rolls, please. I love it. Anyways. Or Tommy goes there and says, "where's your pasta menu? There's no linguini on this." Yeah, I've said "all you can eat? We'll see about that. I'm going to put you to the test." That's my son when it comes to eating pizza. He has a hollow leg, because

38:10 he can fit so much pizza in his mouth. Whenever we order pizza now, we go: okay, we need a cheese pizza for the family, we need the pepperoni pizza, and we need one pizza for him, because he gets his own pizza that he can just eat. I like the nickname I've given myself: the garbage disposal. If it's a leftover, it's going away. As a dad, though, as an Italian dad, we clean it down. I can't stomach throwing food away; we're just not doing that.
38:41 No, you use everything, and everything has a use. Does broccoli have a stem? We'll find a recipe. Exactly. And not to go more Italian here, but I made the best penne alla vodka I've ever made on Sunday. I'll send you the recipe, and I'll admit I used ChatGPT a little. Okay, that's fair. We can talk about that later. Hold on, since we're talking pasta, I've got to give you this, Mike: our family has been starting to make our own fresh pasta. Okay, what flour are you using? I don't know yet. I made a big mistake. I'm

39:12 not a really good Italian: I bought the wrong flour. So, Tommy, this is Mike's personal experience, and this is a revealing moment, so I have to apologize to everyone listening who's actually really Italian; we're coping with this together. I went to the grocery store and said, I'm going to go to the Italian section, where we have cheeses and meats and things from Italy, and find the right flour, the best for pasta. I walked around and found a bag that had nothing but Italian on it. I thought, this has got to be it.

39:43 I know where you're going with this. So I brought it home, and I can tell you I worked for an hour trying to make the dough, and it would never stick together. I was trying to make pasta out of the flour

39:55 that you would make pizza dough with. Oh, I knew that's where it was going. It's great for 900 degrees, terrible for pasta. No, not good. So, that aside, I've learned my lesson; we're getting different flour now. Is it the double-zero flour I should be using? Yes, Caputo double-zero flour. You can buy it on Amazon; I'm sending it to you right now. It says "Pizzeria" on it, but it's the one. We're going to have to change the podcast to a pasta podcast.
40:28 I'm totally down for this. This is the one you've got to use. We'll continue that in another episode. Back to what you were saying, Tommy. We're back to sushi options. This is something I'm feeling more and more strongly about with Fabric: yes, you could do all of it, because you can, all the buttons are available, every lakehouse feature. But that doesn't mean you have to, nor does it mean it's the best choice. To go back

41:00 to the question here, for those of you still stuck on sushi: the idea that if I'm just going to land the bronze layer, then no, I have to do silver and gold too? That's not the case. You may not have the technical skill, you may not have the governance or the long-term thinking yet. The win, and it's a huge win for a lot of organizations, is to tell them: we're just going to store your data in Fabric, and then on an ad hoc basis we'll do transformations and figure it out as

41:33 we go. That's a completely reasonable, even wonderful, idea, because I don't have to make every decision about a lakehouse up front, and I don't have to go through all the transformations at once. It's too easy to think that way, and a lot of organizations are going to fall into this trap of having too many projects at once: because they're storing the data, now "we have to do this medallion approach that Microsoft's pushing." But that's not always the best fit. Maybe it's just the unstructured data and a SQL database, and we deal with it the way we

42:03 dealt with Power BI. Mike, have you ever dealt with a Power BI semantic model that was not gold data? All the time. Because you didn't have the luxury of everything, and you made it work with a Dataflow Gen1 or a lot of steps in Power Query. That hasn't gone away; I can still do that. But guess what: in this scenario, all my data is in Fabric, and I can locate it. It's not "wait, what SQL database or what Excel file is that coming from?"
42:33 And if it becomes a priority, changes can be made on an ad hoc, priority basis. Yeah, and Tommy, I'm not going to challenge you here, but I'll add some more context. You're basically speaking as if everyone's on board with Fabric and moving ahead. There's probably a large majority of Power BI developers who are still only able to use Power BI, because for whatever reason the organization has decided not to let anyone use Fabric: "we're not

43:04 going to pay for it." We know how adoption goes, and it's a different paying mechanism than a pure Pro user. So in light of that, I'm thinking there's a whole bunch of users who say, man, I'd really like to get my hands on Fabric, I'd really like the Fabric SQL databases, but they can't even get to that level, because their organization says we're not ready yet. So they're living in datamarts, Dataflows Gen1, and pure Power BI. Everything we're discussing here assumes

43:35 we're in the Fabric space, and to be honest, the lion's share of users in Fabric are probably still just using Power BI; they're not yet ready to do Fabric. So we're talking about some interesting things here, and I just want to call that out. Because as we talk about bronze data: we can do bronze data with Dataflows Gen1. You make a single dataflow and land the data. But in my experience, you're always going to

44:06 need to transform it, shape it, or join some tables together. It's never quite right and ready for your semantic model straight out of a Dataflow Gen1, unless you do a lot of transformations. And honestly, if you read the Gen1 documentation, it recommends you make a single dataflow to just land the data; once the data is landed, you pick it up again and do transformations to shape it for your final datasets. That's the same thing we're talking about here with bronze and silver.
44:37 So, to keep weaving my points together, going back to "is it reasonable to only have a bronze layer": I'm seeing a couple of patterns emerge in lakehouse development. You can take a lakehouse and name the lakehouse itself: call it the bronze lakehouse, so all the tables that go into that lakehouse are bronze-related. However, lakehouses are now able to come with schemas. So you can have one data lakehouse, or whatever you want to call it, data guide, data hub,

45:08 and that lakehouse can have multiple schemas inside it. You can have a schema that is only bronze tables, with the tables named however you want for landing the data, and then a different schema for silver, or "transformed," or "final," or whatever you want to call it: other schemas that pull from the bronze tables and use them in downstream tables. So, to this person's question:

45:40 can I only use a bronze layer for data? Yes, you can. But is it advisable from an organizational standpoint? I think not. At a minimum, even if it's a single lakehouse, I would have a bronze schema, and then a transformed layer, call it final, call it gold, call it whatever you want: a second layer that's a bit more refined and ready for my semantic model to consume.
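As a sketch of that promotion step between a bronze schema and a gold schema, here is a deliberately framework-free Python version (the field names are invented; in a Fabric notebook the same logic would read bronze Delta tables and write a gold Delta table, so the semantic model only ever reads precomputed results):

```python
def promote_to_gold(bronze_rows):
    """Apply the renames and casts once, then persist the result.

    Doing this in a gold table means the business logic runs one
    time per load, not on every semantic-model refresh.
    """
    gold = []
    for row in bronze_rows:
        gold.append({
            "order_id": int(row["orderId"]),           # rename + cast
            "customer_name": row["CustName"].strip(),  # clean once, here
            "amount": float(row["amt"]),
        })
    return gold

# Raw bronze rows, exactly as the source sent them
bronze = [{"orderId": "1001", "CustName": "  Contoso ", "amt": "49.90"}]
gold = promote_to_gold(bronze)
```

The design choice is the point Mike makes about compute: pay for the transformation once at load time, then let every downstream consumer read the saved result.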
46:10 So Mike, I think you found a treasure map for me here with something you mentioned. I really like what you said, but one thing I think we've struggled with so far in the Fabric game, from both a documentation and a practical point of view, is what pilot programs and adoption look like. Microsoft does have the adoption roadmap, but it's still centered more on Power BI; the Fabric parts are more theoretical, not necessarily logistical in terms of how you actually roll it out. Well, I don't want to say the word "theoretical"; I disagree with you on that one. I think the Fabric

46:40 side of things is more unformed. Unformed, fine. Because if I had to say what the ideal solution is, it's one lakehouse per workspace, landing in that lakehouse with multiple schemas, and then securing down to the table level of detail with security. Those features don't all technically exist today; they're working on them, it's a work in progress. Even schemas in a lakehouse are, I think, still in preview. So yeah, there are

47:11 features I'm using today as part of my workloads that are not quite finished yet, and it's getting better. To your point, Tommy, I would say there are emerging patterns in Fabric right now; we're not there yet. So it's not theoretical, it's being worked on; it's just not solidified yet to the scale of guidance Microsoft has provided before, the managed, self-service, small, enterprise breakdown of how you actually roll it out.
47:42 Your comment gives me much excitement, and in a very practical way: storing the data is a milestone. You want a pilot program for Fabric, and what does that look like? Just the idea that the first milestone we're going to work on is getting our data into Fabric. Then you can do your Dataflows Gen1 if that's what you're comfortable with; we'll eventually migrate. But we know, to your point, that organizations

48:13 are slow to adopt technology, and we are so many steps ahead of where most of them are in terms of the blog and the future updates. I would feel pretty safe assuming any feature announcement you see today is not going to be widely adopted for another six to nine months minimum. That's just the way it is, regardless of preview or GA; it's just how long organizations take to get around to it. So, Back to the Future: let's assume

48:43 we're six months behind on Fabric features, like many organizations are. All that to say, it's a strong selling point for Fabric: if you don't know where to get started, and you don't have the skills, you don't have data engineers on your team, you're all Power BI people, then just get your data into Fabric, and then do your Dataflows Gen1 off of that, and your transformations off of that. What a solid place to start.
49:14 Yeah, I agree with you, Tom. This is a really solid question. I've said this multiple times with my clients, when I work with customers: you want to optimize for the compute; you don't need to optimize for storage. In the past, with SQL on-prem, we had to optimize for storage. The reason is that you bought a piece of hardware with a certain limit of space on the machine. You bought a SQL Server with 20 terabytes of space, and that was it: all

49:45 your data had to fit in 20 terabytes, while every day you're loading things in, maybe snapshotting data. So there was a lot more concern in the SQL world, before

49:53 we got to lakehouses, about how much was being stored; it was such a constraint. This is where dimensional modeling came from, and a lot of the hyper-efficient storage methods: I'm going to do snapshots, I'm not going to store the whole thing. The reason we're moving to the cloud is that storage becomes far cheaper relative to the compute side. Now think about your on-prem SQL Server: you already bought the machine, you capitalized it, you purchased the compute on day one, you own it. So however

50:25 many virtual cores you own, say a hundred of them, you want to use as much of those hundred virtual cores every day as possible: push data around, do database things, do everything you need to do. You've already paid for those cores; you want them running non-stop to get the best value out of your money. So the fact that you prepaid for compute and now want to use all of it, I understand. But Fabric is different in that you can scale

50:55 the compute up for what you need. To some degree Fabric is similar, right: you buy a certain F SKU, and that's the amount of compute you get for the day, or the month, or whatever it is, and if you buy more F SKUs you get more compute. Same principle, you're buying dedicated compute capacity, but the storage side of things basically goes away as a concern. I have other thoughts on the optimization side, but I do want to answer one last question here. I don't want to bury my own point, but I want to quickly shift gears.
51:26 I’m want to fly shift gears there was one other question here that was talking about Can this can be considered can we use this to in a blob storage account way of a hot and cold storage so hot storage is more expensive but it’s cheaper to access read and wrs cold storage is more expensive to access on read but it’s very cheap to write the data in the idea being is I just need to store the data but I want to store it very cheaply and I’m not going to access the information very frequently so my what I know right now
51:57 my what I know right now fabric has no concept of Cold Storage it’s everything’s in hot no matter what however what what I do recommend for companies here if you are really interested in using Cold Storage inside your fabric environment and the fact that you’re even asking this tells me you already know a little bit about Azure already you would go spin up your own Azure blob storage and then once a quarter once a month decide what data you’re not going to use anymore and run a pipeline and physic basically copy the
52:27 a pipeline and physic basically copy the data out of your hot storage and fabric load it into your cold storage inside lake house inside your blob storage and then go back to your fabric environment and delete the data out what that does is it basically if you have a bunch of snapshots if you have a bunch of old data you’re taking down a server and you just want a copy of it right copy it over to to fabric use it for a period of time and if you no longer need it anymore you can always copy that data down to Cold Storage in Blob storage that’s technically still accessible
52:58 that’s technically still accessible inside fabric but you can then get different pricing options on what storage accounts What containers are using hot warm or cold storage to then further reduce your cost to store data but again every time I’ve done one of these projects almost no one is concerned about the storage costs everyone is concerned about the compute cost that is where the real money comes from you’re talking like I don’t know 5 seven8 I don’t know what the number the comparison is but it’s roughly like a 5x 6X cost in
53:30 I don't know the exact comparison, but it's roughly a 5x or 6x cost in compute versus storage. So I'm always going to optimize for: how can I bring my data in, compute it one time, save the results, and then just read the results? Yeah, honestly, I have no argument or disagreement there; I agree 100% with what you're saying. You really don't want to treat this the same way as Azure. It's not a copy or a replication of Azure Blob, except for the hot-tier concept, so I would not want to treat it that way.
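The "compute once, save the results" point can be put into rough numbers. The figures below are illustrative only, using the episode's rough 5x-6x compute-versus-storage ratio; actual Fabric capacity and OneLake pricing will differ.

```python
def monthly_cost(reads, compute_per_run, storage_per_month, precompute):
    """Toy cost model: either recompute a transformation on every
    read, or run it once and serve the saved results. Illustrative
    numbers only -- not real Fabric pricing."""
    if precompute:
        # one compute run, then cheap storage; reads are near-free
        return compute_per_run + storage_per_month
    return reads * compute_per_run

# Compute priced at ~5x storage, 30 reads a month (assumed numbers).
recompute_cost = monthly_cost(30, compute_per_run=5.0,
                              storage_per_month=1.0, precompute=False)
precompute_cost = monthly_cost(30, compute_per_run=5.0,
                               storage_per_month=1.0, precompute=True)
```

With even a handful of reads per month, paying for one compute run plus cheap storage wins by a wide margin, which is the whole argument for saving results instead of recomputing.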
54:01 No, we're not going to go into Fabric thinking, oh, I'm going to move this data out of bronze into some cold storage layer inside Fabric; today that does not exist, that I'm aware of. If someone really had a hard requirement like, hey, I have to move data into cold storage, I would push for a Data Lake Storage Gen2 account in Azure, physically copy the data over, and move it out of my Fabric environment if I wanted to start saving money that way.
54:31 I think that, and again, we have all the options available; the Data Lake Gen2 ones are awesome, except for SQL databases; we're still working on that, so it's going to get better, it's going to get smoother, it's not going to bug out on me. But no, I love that; I really have nothing else there. I think we're probably at closing thoughts now. Honestly, my closing thought is pretty straightforward: if you're at that beginning stage of getting to Fabric, or already have it, the idea, from a
55:01 milestone point of view, of getting your data into Fabric first and then dealing with what you need to deal with, can be a huge win, especially if you don't have the technical expertise or the maturity yet. The win itself is saying: my data is in Fabric, raw, in bronze, and it needs all the work in the world. Fine, but guess what? I have the governance around it, I know where it is, and we can now prioritize from there rather than trying to boil the
55:31 ocean. Make your choices wisely, but a huge, wonderful first step, even as a pilot program, is getting your data into Fabric. So I love this mailbag question. Yeah, I agree with that one. I'll also add one final thought here: in doing this work with Fabric, building lakehouses and tables and things, one thing I find a lot is that people use a lot of prefixes on their tables. If you're building bronze tables, raw data, transformed data, and you don't use
56:03 schemas, and you don't have different lakehouses for the different areas of data, you're going to need to prefix your tables: this is the user table, raw data, in the bronze layer, so bronze_users, right? That means you're going to load the data in and then you're going to need to transform it. Every project I've ever been on, there's always some level of transformation occurring on the data: you're merging something, you're deleting something, you're inserting something. So my opinion here is that the bronze layer
56:33 should be as close a copy of the data as possible to whatever the source system holds, so you can verify you're at least copying all the data you need into your system. The second part is that you're going to need some layer of transformation: inside that single lakehouse you're going to pick up those raw tables and transform them slightly. For that reason, I think the minimum you would need is bronze and gold; that's the bare minimum. As your company matures and you do more things, you'll probably want some layer of silver in there in the future.
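The minimum bronze-to-gold step described here (keep bronze as a verbatim copy of the source, then merge, dedupe, and filter into gold) might look like the sketch below. Table and column names (`user_id`, `is_deleted`) are hypothetical, and in practice this would be a Spark notebook or dataflow rather than plain Python.

```python
def to_gold(bronze_rows):
    """Bronze stays an untouched copy of the source, so you can verify
    everything was loaded. Gold applies the transformation every
    project ends up needing: last-write-wins merge on the key, then
    drop soft-deleted rows. Column names are hypothetical."""
    latest = {}
    for row in bronze_rows:           # rows assumed in load order
        latest[row["user_id"]] = row  # merge: last version wins
    return [r for r in latest.values() if not r.get("is_deleted")]

bronze = [
    {"user_id": 1, "name": "Ada"},
    {"user_id": 1, "name": "Ada L."},                   # updated record
    {"user_id": 2, "name": "Bob", "is_deleted": True},  # soft delete
]
gold = to_gold(bronze)
```

Even this tiny merge-and-filter step is a transformation bronze alone can't give you, which is why bronze plus gold is the practical floor.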
57:03 But you don't need it at all times. So I like this question; I think this is great. This is really the community trying to understand and unpack how this works. Again, people think that lakehouses and the medallion architecture are so different from what they've been doing traditionally with SQL. I came from the big data and data science realm, so I'm very comfortable with this technology; I like it, and I think it serves a lot
57:34 of needs here. I didn't come from the SQL DBA world, and the more I learn about SQL DBAs, there are so many really solid data warehousing concepts that we're just repeating again inside the lakehouse. It's still data warehousing, it's still the same technology: it's still bring in raw data, stage the data, transform it into final tables. None of this has changed. And Donald in the chat here is a big advocate for Kimball, right?
58:05 Kimball is the data warehousing expert, the one who wrote the guidelines on how you build this stuff, and there's no change to any of that; all of it stays the same. There's no difference between lakehouses and SQL if you're going to apply a Kimball schema or Kimball principles on top of your data warehousing; it's the same stuff. For that reason, I don't really think it's that different. The terminology has changed and the technology under the hood is different, but at the end of the day you're still data warehousing; it's still the same stuff. All right, Tommy, with that, I will say thank you very much for listening to
58:35 the podcast. We've talked a lot about this one; it went very fast yet again, and I'm very pleased with the topic today. Thank you for listening. Our only ask is that you share this with somebody else. And don't forget: if you're going to the Fabric conference in Las Vegas, there's a discount code in the description of this video below to get $200 off your tickets. That code can be reused, so if you have multiple team members who also want $200 off, feel free to share it with them; it'll be in the description of all of our videos. It's your opportunity to say to both of us, to our faces, I
59:05 actively disagree with your statements. Yes, exactly. All right, Mike. You can find us on Apple, Spotify, or wherever you get your podcasts; make sure to subscribe and leave a rating, it helps us out a ton. Do you have a question, idea, or topic you want us to talk about on a future episode? Head over to powerbi.tips/empodcast and leave your name and a great question; a good question gets you a little extra points. Join us Tuesday and Thursday at 7:30 a.m. Central, and join the conversation on all
59:35 PowerBI.tips social media channels. Excellent. With that being said, we appreciate you so much; thank you for your time today. We will see you again next time on the podcast. We'll have a recorded episode coming up next, I think, and then after that we'll be live again. So with that being said, thank you all for listening, and we'll see you next time. [Music]
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
