User Input Tables – Ep. 332
In this episode of the Explicit Measures Podcast, we discuss user input tables and what it means for Power BI developers.
News & Announcements
- Microsoft Fabric June 2024 Update | Microsoft Fabric Blog | Microsoft Fabric — Welcome to the June 2024 update. Here are a few select highlights of the many updates we have for Fabric: the Fabric Spark connector for Fabric Data Warehouse in the Spark runtime is now available, and Query activity, a one-stop view…
- PowerBI.tips Podcast — Subscribe and listen to the Explicit Measures podcast episodes and related content.
- Power BI Theme Generator — Create and download Power BI report themes using the PowerBI.tips theme generator.
Main Discussion
User input tables are one of those requests that sounds simple (“let users type values into the report”) but quickly turns into an architecture discussion. The episode walks through common approaches and why “writeback” needs to be treated as an application feature, not a visual trick.
Key points:
- Clarify the use case: ad-hoc scenario planning, approvals/comments, data correction, or parameter-driven exploration all need different solutions.
- Options range from what-if parameters (no persistence) to Power Apps / Dataverse (persistence + security) to more custom writeback patterns.
- Governance matters: who can edit, where the data lands, auditing, and how it affects downstream reporting.
- Keep UX realistic—Power BI is a great analysis surface, but not always the best data-entry UI.
- If you do implement input, treat it like a product: validation, permissions, and a clear lifecycle.
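To make the lightest-weight end of that spectrum concrete, here is a minimal sketch, in plain Python rather than DAX (the function and value names are invented for illustration), of how a what-if parameter behaves: the selectable values live in a disconnected list generated up front, and the user's choice is applied inside the "measure" at query time, never written back to the source data.

```python
# Sketch of the what-if parameter pattern outside Power BI:
# the selectable values form a disconnected table (like DAX's
# GENERATESERIES), and the chosen value only modifies a calculation.

def growth_scenarios(step=0.05, count=5):
    """Generate the disconnected parameter values (0.0 .. 0.25)."""
    return [round(i * step, 2) for i in range(count + 1)]

def projected_sales(base_sales, selected_growth):
    """The 'measure': the base value adjusted by the user's selection."""
    return round(base_sales * (1 + selected_growth), 2)

scenarios = growth_scenarios()                 # [0.0, 0.05, 0.1, 0.15, 0.2, 0.25]
print(projected_sales(1000.0, scenarios[2]))   # 1100.0
```

Nothing here persists: close the report (or the script) and the selection is gone, which is exactly why this approach is wrong for approvals, comments, or data correction.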
Looking Forward
If you’re facing an input-table request, start by deciding whether the inputs must be persisted and audited. That answer will usually tell you whether you need Power Apps/Dataverse (or another app layer) versus a lightweight parameter-based approach.
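If the answer is "yes, persisted and audited," the episode's point that writeback is an application feature can be made concrete. The sketch below uses plain Python and SQLite purely as a stand-in for whatever app layer you choose (Dataverse, a warehouse table, a Power Apps backend); the table and column names are invented, but the ingredients are the ones discussed above: validation, identity, and an audit trail.

```python
import sqlite3
from datetime import datetime, timezone

# Minimal sketch of a persisted, audited input store. Schema and names
# are illustrative only; in practice this role is usually played by
# Dataverse or another governed store, not hand-rolled SQL.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE user_inputs (
        scenario   TEXT NOT NULL,
        multiplier REAL NOT NULL CHECK (multiplier BETWEEN 0 AND 10),
        entered_by TEXT NOT NULL,
        entered_at TEXT NOT NULL
    )
""")

def save_input(scenario, multiplier, user):
    # Validation (the CHECK constraint) plus audit columns:
    # the "treat it like a product" part of the discussion.
    conn.execute(
        "INSERT INTO user_inputs VALUES (?, ?, ?, ?)",
        (scenario, multiplier, user, datetime.now(timezone.utc).isoformat()),
    )

save_input("Q3 stretch", 1.25, "tommy@example.com")
row = conn.execute(
    "SELECT scenario, multiplier, entered_by FROM user_inputs"
).fetchone()
print(row)  # ('Q3 stretch', 1.25, 'tommy@example.com')
```

Even this toy version shows why governance questions come first: the moment inputs land somewhere, you own who wrote them, when, and whether downstream reports should trust them.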
Episode Transcript
0:18 [Music] Good morning and welcome back to the Explicit Measures Podcast with Tommy, Seth, and Mike. Good morning, everyone. Tommy, there are a couple of little intro things here, but before we get into our topic
0:48 today, we're going to pick up an article from... oh, I forgot the name, I don't see the name on here... DAX Noob. I like the name. Justin Martin. I didn't know Justin Martin's blog was DAX Noob, but I love it. You're right, Tommy, the website is very fun and a good name. Today we're going to talk about enhancing your golden semantic model with user input tables.
1:18 I love how everything has now gotten a thousand words longer just because the phrase "semantic model" is in there. What was it before? Just models. 20% of my training slides have been changed to "semantic model," but I still have 80% to go. When you talk about data too much, your whole family teases you about the word data; you just say the word data all too much. Data, data, data.
1:49 All right, Mike, right as we kick off I need to give you some kudos, some credit here, and it actually feels hurtful, it is painful to do, but credit where credit is due. I've been having a lot of conversations with a few clients, and they are switching everything: they tried Dataflows Gen2, they're switching to notebooks, and they're
2:19 loving the experience. It's because they love the user interface of Dataflows Gen2, but that's where their fires start. It's basically like just starting to do electrical wiring: it's really easy to do it this way, but it sparks a lot. That's Gen2 for a lot of people. And man, I'm telling you, you converted me. I was completely on the dataflow train, and I still love dataflows.
2:49 It's like your first comment is "I'm not using it, but I still love it." I don't want to use it, but I love it. Like family, like my third child. Extended family. But notebooks, man, I've been switching some of my own processes there too. Notebooks are the way to go, and it's a good blend. For those who are not involved with notebooks: you should
3:19 definitely start playing with notebooks, getting involved with Spark. The reason I really like notebooks is that it's a good blend of "here's a little bit of code that's easy to read," and then you can see outputs regularly: do a thing, see an output; do a thing, see an output. And from a collaborative perspective, Tommy, you're doing a lot of things by yourself at this point, but as soon as you get to the collaborative experience around notebooks, it gets even better. Two people can be working in the same notebook, writing the same code together, running the same execution off
3:50 the same engine. That's amazing. And take it from me: I feel like I'm a pretty good use case, because two years ago I really wasn't touching notebooks, because I had no need, and it's been a very self-taught thing. Especially now, in our day and age with copilots and AIs and all the things, it's very seamless for me. So yeah, credit, Mike. Maybe Mike,
4:20 you've been right all along, and it just took this long for Tommy to realize it. That goes hand in hand with most of my experiences. You just rocked my world. Seth, you came out with an interesting article here, and I like where you're going with this one; it's a great article. I'm going to jump in so you don't steal my thunder, because I'm doing this right now. It was uncanny: Jean Hayes, Senior Customer
4:53 Engineer, Azure FastTrack at Microsoft, posted on LinkedIn yesterday, a mere 17 hours ago, about how she created a scenario where she put paginated reports on top of a six-and-a-half-billion-row table and got results in about a second. Mike, post the little blurb; she wrote it up on the
5:23 Tech Community and outlines her code and how she went about doing it, with compaction using Z-ordering and then three parameter tables, passing the parameters in from the report, and the results are astounding on Fabric. Mike, I know you want to talk about this, but the recent advancement with paginated reports easily connecting to even Data-
5:57 bricks has been an unbelievable game changer in producing simple datasets for the business to get access to immediately. It literally takes 5 to 10 minutes: oh, you want this report, you need access to this? Done. Yep, it's pretty crazy. And I think the challenge here is you need Databricks or Fabric to materialize
6:28 billions of small transactional row-level things that people want to dive into and go get. Let's be clear: this is stuff where we're not rolling up large aggregations across lots of table rows. This is "I need fine-grained information about really wide, long, big tables," and as a user I'm trying to pick a couple of columns I need, here's the date range I need the information from, but it's going to be a very large table, and it
6:58 needs to produce those records quickly. Users going into that aren't asking for all billion records to be shown in the paginated report; they're looking for a section, a window of that information, call it 100,000 rows of data or less. This is amazing, and the reason I'm very excited about this article is that I literally spent probably six hours in an afternoon beating up ideas, because our semantic model is getting complicated; we've got really interesting things going on there. So with
7:29 engineering, I'm sitting down and saying: is there a better way to do this? We just started beating up ideas, and this idea came up. It was weird, because within four hours of us deciding we should try this, to see if it works performance-wise (I haven't done a six-billion-row paginated report yet), the article comes out, within hours of us deciding this is a thing we should be thinking about. I'm like, wow, this is amazing. This actually might change some of our recent architecture, because it's
8:02 such an interesting way to go about getting to the details of the information. And to be clear, Power BI is great at visualizing things; it is not good at data dumps. There is no data-dumping portion of powerbi.com. When we talk about the needs of the business, there's always this need to, one, see insights from data: tell me the answer, aggregate it, show me the aggregated form, let me compare it across different time periods. Then there's the other side of it: I just need exports of data, I just need access to
8:32 the information. And to me this really feels like it fills that need of "I just need access to lots of information." This is the new segment, but I like where this is going, and I'm going to lean into it a little bit. What's interesting to me, especially as Jean outlines the structure of the underlying table and how she gets the parameters to look this up very quickly, is I wonder if
9:02 there's a thought in here where Power BI is that fantastic aggregation tool. It's not data export, but to refine what you're saying: yes, there's a need for data dumps, but there's also the need for "okay, you've given me the aggregated view, I've been through these billions of records and seen something in the data, and now I want to go look for that thing," and this option on its own
9:32 allows me to do that. But I wonder if there's a path in here to knit these things together in some way in that report, where I can now flip into, potentially, an embedded paginated report and say yes, I'm going to pass these in, because I found these things in the aggregated data, and I don't have to maintain that level of granularity in the model, because
10:02 I can't. But now I have a path towards sifting through the six billion records in a really efficient way to return the thousand. That is really compelling. Yes, and that's where I'm getting to the point of whether you need to see all the data. One thing
10:21 I'll be very clear about for anyone reading this article: one, it's really awesome, a good article. You'll also notice, and I'm not going to say it's a trick, but what's also happening in the article is that we're talking billions and billions of rows, yet the parameters used are year and month, plus a vendor flag. The setup of the data is actually very smart: they're taking the New York Taxi data and multiplying it three or four
10:51 times, so it's a really big dataset, which is awesome. But the New York Taxi dataset has dates and this vendor field, and the dates are pretty much broken up by partitions in the original dataset; you have roughly one Parquet file per month. So yes, I have six billion rows of data, but I am looking at information coming out of a particular year and month, which is what they built the report on. That's really what you want to be doing: figure out the right combination of required parameters the user
11:22 needs, so you're not listing millions of records, getting it down small enough to a reasonable size of data. Because to your point, Seth, it's not relevant for you to download a million records; no one can even do anything with that on their own computer. Excellent. Anything else on the paginated report piece? There's other news; I just want to move on. Sorry, that was not clear, my bad: paginated
11:53 reports. So there's a blog announcement for, I guess, the June update, but before we get there: Tommy, you had another question around streaming. Is anyone using the streaming analytics portions of Fabric? Yeah, so I wanted to ask you, Mike. Right before we went online we were checking in, like: hey, what are you hearing from your clients now, what's the beat on the street, so to speak? Or have that be a new segment: Beat on the Street. There you go.
12:23 Okay, I like that. New segment: welcome to Beat on the Street. What are you hearing from your clients? We come up with this stuff live, people; this is real-time thinking right here, because we don't plan beforehand. The only podcast where you get real-time new segments as we build them. This is real-time streaming analytics right here; you want to hear thoughts for the first time. So we always talk about what you're hearing from your clients, what
12:53 they're asking for, what they're enjoying; that's actually where the dataflows conversation came in, because someone was switching. And there are a lot of updates in the June 2024 update for Fabric around Eventstreams, Event Hubs, and real-time analytics. Mike, I was asking you, and I think you said you had something interesting around real-time streaming. Yeah, so we are working on a couple of things.
13:23 Anytime I have a conversation around streaming, it usually turns into "that's really nice that you want streaming." The core question I ask is: what decision do you need to make in the time between when the data shows up now versus ten minutes from now? Is there a decision you can make right now that changes the impact of future data points? That's a very technical way of asking the question, but: how does this data let you take action immediately?
13:53 There are definitely situations where this does make sense. I was working for an airline; they have real-time needs for data. If that door doesn't close at a certain time, that plane is losing money on the runway. There's a real cost implication for what's going on, and someone needs to be notified that there's a block in the system. That I understand. But sometimes teams say, hey, if we had the data faster, if we could get it in quicker, we need that data in our hands, and my question is:
14:25 is it worth the spend? You need to really evaluate the value of the opportunity of having that data in quicker. I'm not saying you don't do it, but any kind of streaming thing typically means there's some computer that's on all the time, or you're building something a bit more elaborate, like a function-based thing that's listening: you send it information, it does a little quick processing, and then it moves on. All this to say: Tommy, to your point, I was
14:55 showing a client pipelines and notebooks doing micro-batches multiple times per day, loading things. They loved it. They thought: this is easy, simple, makes sense, we get it, we understand it. Batching and micro-batching: understood. As soon as we moved into the streaming realm (Data Activator, Eventstream, stream events), all the things started not quite fitting together, and it was much more difficult to build a solution. There
15:26 was a lot more clicking around; we were doing things in the UI that didn't quite make sense. Yes, some of it is still in preview, some of it's not, but in general it felt rough around the edges. I actually had a candid conversation with the Microsoft team. I said, look, this is not quite where I expect it to be; here are some things I would really like to change to adjust the direction of where we're going with streaming. It needs to be smoother. One of the challenge points I had was working in pipelines and trying to get a real-time trigger to run a pipeline. This is very common: files come
15:56 in, a file needs to be processed, and I need to pass file context information to a pipeline that then executes some load and writes files into a Delta table somewhere else. It's a very common pattern, but it wasn't very smooth to get the triggers to work on real-time eventing showing up from Fabric. So again, maybe I had a vision of what I wanted to build based
16:27 on prior Azure things, and what Microsoft was producing might not have exactly aligned with that vision. With the Eventstream things I'm looking at now (this is very technical), to get an event to show up you have to use this thing called a shared access signature, SAS I think it is, or a shared access key, something like that. It basically says: hey, I'm going to send a message to this location, but it's going to be secured so that random messages don't get in and
16:57 the ones you want come in. They don't have basic authentication, username and password, which most webhooks have. So to me there's some basic functionality that other people need in these services that just isn't there, and that causes you more headaches and makes you build slight workarounds; it doesn't work smoothly. All this to say, I feel like there are two challenges with this streaming thing: it's more expensive (there's value, but you've got to weigh the value), and then
17:27 Tommy, to your point, I think where you were going with this one is: it's getting close, but it isn't quite there yet. Well, I was going to say they have some great features, but to your point, we need the simplified thing. This is like building a car: look, we've got Apple CarPlay, we have a great AC system. Yeah, but it doesn't start all the time. Well, I think this is where you're depending on technologies at the bleeding edge. It's
17:58 where you're pushing the bounds. It's brand new, things are getting put together, and certain use cases have gaps you have to work around, or you can't, and then you have to revert to something else. Not that it's not going to get there; it's just new, or there are certain features they have yet to push out. If you think about this stuff, we're moving more towards a cloud infrastructure,
18:28 period. More of the applications you're buying are going to be cloud-facing. To your point, Seth. And Donald, in the chat (Donald, you're amazing, you say great things here all the time), is talking about how in the 80s this was called decision support systems. That's what we're doing here. If you needed real-time data, you went back to the source system and asked for it from that system; that's the real-time data you would get. But as we move more towards cloud-based systems, there's still this need to get data as it occurs,
18:58 and so what do you build? Well, you build something like a webhook, and the cloud system says: hey, instead of giving you only the piece of data that changed, I'm just going to send you the whole record; this record changed, here's the whole record, do with it what you will. I feel like there's a trend I'm starting to see where these cloud systems are trying to integrate with your internal data, or give you information, and as they do it they're giving you webhook-like listeners, streaming of event data, and saying: look, as we change the
19:28 data, we'll just send you a little message; that's cheap for us to do as a cloud provider. Now it's up to us on the other side: how do we build the system to catch it, and how do we make sure that data matches what's in the cloud system? You're posing two interesting situations I'm seeing: there's the analytical side of this, but there's really more of the action side. Two tools I use all the time are Zapier and Power Automate. Right, and it may
19:58 not be right for a billion rows, but for the event-based actions, where some of it is my records in tables: if something billable happens, a bunch of things kick off. Right, and that's really what's needed for, to your point, airplanes, airports, supply chains, IT security; that's more real time. Totally. But there are a lot more cases where the use case is action-based. Yes, and that's a great point, Tommy: security
20:29 is another great one. If someone's computer is getting compromised, those are things we'd like to know as soon as possible; those, to me, are things that make sense as real-time eventing. And then the question, for me, becomes: when the event happens, how do I get it into the right person's
20:42 hands as quickly as possible so they can go take action on something, go talk to the person, go shut down the computer, do whatever the thing is that makes sense? The question that should come from "real time" is: and then what? What's that little kid meme, "and then? and then?" Isn't there a meme, a little kid that just keeps saying "and then"? Maybe it's in a movie, I don't
21:13 saying maybe it’s in a movie I don’t know and then and then and this is also real time conversation there you go we just went went there now we’re gonna what’s interesting I guess to me as you guys were talking is systems in general output tons of data all the time when when we’re talking about real time sometimes that is you about real time sometimes that is it it’s almost just where where are know it it’s almost just where where are you picking up the evented logs to make meaning of them but
21:44 evented logs to make meaning of them but but real time data is happening all the time it’s just a matter of whether or not you’re analyzing it and that’s where the true cost comes in from you’re plugging into that stream of essentially the stream of logs right that all these systems are generating outputs from so yeah exactly yeah very cheap very cheap to have all of those systems just running and outputting things not necessarily cheap when you have to dip into that stream for specific things yes and and
22:17 specific things yes and and then put them in a location where where you’re monitoring only those aspects but there are a lot of other tools that do that right like correct that you can you can leverage to expedite and and make that a che and you’re speaking about the distinction I think between action based and I think what Microsoft’s really doing with fabric is it’s real time intelligence and that’s what they’re really trying from a feature point of view to do not just obviously capture the data but also the intelligence around it I felt that that was a huge message at Bild this year there was a
22:48 message at Bild this year there was a lot of focus on real-time streaming analytics and getting your data is in motion it’s always moving it’s always happening I’m not sure the business is quite ready for that I think maybe there’s a a request for for it but I think if you have a number in the morning that’s different than a number in the afternoon that confuses people honestly too too fast well like think about think about the conversations like many of the conversations are hey in the morning I’m going to go check the sales numbers for whatever right and then you have to build a presentation
23:19 then you have to build a presentation when you build a presentation the numbers don’t match right like it ruins the story of unless everybody think about what that would mean from a data culture perspective everybody would need to know that the systems are so plugged in that that number should change on a daily basis or hourly or whatever the case may be right yeah so I think I agree I think I don’t think the wider Organization for normal reporting or Insight based purposes really need like
23:52 Insight based purposes really need like would key in or be it wouldn’t there’s not a ton of value for like real time everything all the time I would agree with that one so the internet never fails apparently this is from dude wears my car and there’s someone in the in the speaker talking about then and then and then that’s yes yes I I’ve never honestly I’ve never even watched the movie I just know the clip that’s all I know yeah I know that’s funny talk about okay well thank you internet for for picking up that one so mark thank you for the
24:23 credit on that one; you knew the movie quote that I didn't even know, and I'm quoting it, so I should be careful about quoting things I don't know. Excellent. We have an opportunity here: we have one more thing. We can go through the monthly blog from Microsoft, or we can go into our main topic. We're about 24 minutes in; we probably should just transition into the main topic. All right, main topic for today, let's go there, since I didn't hear any convincing arguments one way or the
24:54 other. Let's jump in. So the main topic for today: we'll be talking about the DAX Noob article, and we're going to talk through what these extra user-based tables are and what they're going to look like inside our data models. Tommy, take it away; give us some enhancements here about user input tables. I am looking forward to this conversation with you guys, and really to how you're going to take this. But that's all conversations, actually. You've got that deep tone
25:24 there. I'm sorry, go ahead; I'm going to sit back and listen. Continue, Tommy. I'm trying a slightly more podcast-style take on this. Hey guys. No, I'm not going to do the morning-radio voice with sound effects. So let's go through the article. There's a very intriguing argument here when we're talking about user input tables. What DAX Noob goes through is really, first, how important that golden semantic model is, but also whether you can
25:55 incorporate this concept of a user input table. And what is that, really? It's a table used to capture the selection of a user and use that selection to modify the result of a measure; somewhat familiar from what-if measures, but this is different because you're now also incorporating any user going in and really affecting a table, affecting the measure, and affecting the numbers. He goes through some great concepts and approaches using DAX to do so, and to take it
26:27 up a notch, he goes through some suggestions for time intelligence, for parameter fields, and for a date helper. I think this speaks partly to our writeback scenarios, but also partly to the flexibility for consumers to have not just the ability to filter and interact, but also to modify and really choose their own story from a report point of view. So let's start there on the benefits: what have you seen with user input tables, what are
26:57 your first thoughts when it comes to allowing the user to proactively adjust the numbers in a report? Well, I agree with that, Tommy; I think it's a great summary of the article and where we're going. I'd like to quickly define a user input table and make sure we're very clear on what we're talking about, because I know what we're talking about, but I want to make sure our listeners are also on the same page. Think of it this way: you have a report page, and you need a
27:28 user to enter a piece of data somewhere, in a field, a box, a window. Say it's a zero-to-100 number that you need to add, and that number is, call it, a multiplier: you're doing some math on data coming from a model, and you want to multiply by some factor, a 2x factor, a 3x factor, or a number between zero and one, something along those lines. That factor needs to be built; you can't just enter a number. So these are
28:00 tables that are predesigned; they are loaded into the model before users can use them. However, they're useful in that you can leverage them inside the context of the model. That's one of the differences I see between Power BI and other BI tools: there is no input field box. And I think this would be very simple to implement; I don't really know why it's not already there. Why isn't there just an input box
28:31 that takes a number anywhere from zero to a thousand? I don't need to store every single combination of that number; I just need a number, and maybe that's what I use to multiply a measure or some other calculation inside the model. That doesn't always make sense to me. There do seem to be other tools out there that will literally let you have an input box, an input field. If you want true input-field things, you can go to a custom visual, which does give you true user input controls where you can type in actual numbers, and then those numbers are
29:01 and then those numbers are represented as a number and then it manipulates that visual so there are other tools that allow you to do that did I do a good summary there Seth did I miss something no I think your summary is good I think what is potentially confusing or what I want to clarify right when we're talking about a user input table where my mind goes is the individual user all the time and that's not really what this is it's user input tables but these are standardized trying
29:31 to standardize or create standards of reusable user inputs functional things that people can leverage yes when building reports on this model right that have that flexibility of the areas that you're talking about that almost provide an input to the report or methods by which they can easily extend or modify the model without all of this other stuff as composite pieces in their own model etc
30:01 right so extending the user experience or I should say the development experience of users and calling them user input tables but it's not a specific user thing every single time coming in well this is intriguing too because this is not a feature or a functionality that you can just randomly publish out to a report because no one's going to know what to do with it to your point like yeah it may not be a
30:32 builtin feature to Power BI so to speak it's like a standard visual but I think there's a reason for that because let's say you had a scenario where you were just like hey I have a nice idea for a user input table well there has to be some training around that or some support or some education this is not something I think anyone's gonna intuitively just say oh I'm going to interact with this and this is going to adjust it for everyone
31:02 I don't know what to do here so there
31:04 does have to be the need for what are we trying to do with this either there's already been a conversation with your stakeholders or the consumers not just stakeholders I suppose right but really do the consumers need to know that they're making a selection and the filter context just works the way it is like this seems much more of a developer thing if I have this functionality in a report slicer filter whatever they're choosing
31:35 something yes like the implementation of this is much more on the dev side isn't it agree implementation yes but I think to your point I'm seeing this as a lens on both sides of the argument here yeah I agree with you Seth I think you're right on point the article is catering more toward developers saying here are some common patterns at least that's how I'm reading it here are some common patterns you may want to use inside your models such that
32:06 thin report developers or people building like there's nothing that says you couldn't use these things inside a measure that thin report users may want to use the challenge here for me becomes how do you hide or unhide them for all thin report users so these are things that users may want to build into their reports but this is how you would set them up and here's some common patterns right I think I'm not articulating that very well I think it's both sides of that story both developer and user side well so there's I think
32:38 there's definitely that case here because if it's in the report itself in a thin report then there's that consumer facing side but I think Seth I see what you're saying because if I'm building this in a gold semantic model which is the beginning of the article well then that has to be part of the model or else I'm just building a composite report and do you want to do that and this is where I get a little bit stuck he's adding this to the gold right and that's I think that's where we start do we actually want to do
33:08 this in a gold semantic model well maybe where I was going with this one I think some of these I use more often than others okay right so a list of zero to one in 0.01 increments I probably don't use very much a list of 1 to 100 in increments of one I probably don't use very much however selecting month year to date month to date those things I do
33:40 use so what are some of the scenarios where you've actually utilized a user input table whether it's gold or semantic and for me it's funny you say you've used it for the date I've used it a ton for ratios man to say like hey if your conversion rate let's say oh yeah well that might be your industry right that's a lot of marketing stuff yeah or really sales rate open rate close rate I'm gonna have a thousand impressions my conversion rate is X and if I want to dial that up or down then that's the ratio that you're looking to build so I think in that situation it would make a lot more sense to have that as a ratio
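The conversion-rate ratio described here is the same disconnected-table pattern with a narrower range. A hedged sketch ([Total Impressions] and the 2% default are made-up placeholders, not from the episode):

```dax
-- Candidate conversion rates from 0% to 10% in quarter-point steps
Conversion Rate = GENERATESERIES ( 0, 0.10, 0.0025 )

-- Harvest the selected rate, falling back to an assumed 2% baseline
Selected Rate = SELECTEDVALUE ( 'Conversion Rate'[Value], 0.02 )

-- Dial projected conversions up or down with the slicer
Projected Conversions = [Total Impressions] * [Selected Rate]
```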
34:11 and something I've done I've actually found great success with that where it became one of the requests like oh could you actually add one for this x-rate because we just want to see how that would adjust over time sure and yeah so that's been a huge one for me I will put the caveat out there it was never in a gold semantic model so that's a good point I think that yeah this is maybe where I'm going to debate this article a little bit does
34:41 it make sense to put it into gold because putting things like this into gold has two implications I think one it takes a little bit more technical know-how to get it in there correctly get it right yeah and you got to make sure that whatever you're putting in is something that would potentially be used by other people cuz once you put things in gold models it's going to be very difficult to remove them because you don't really know how everyone is using them all the time so when you put it in the gold model it could be used by measures in thin models which I think we can go get but there's no real clear lineage of does it exist so
35:13 I'm of the opinion your gold model should be as lean as possible because it's going to be way harder for you to remove things from there than it is for you to add stuff so the net gain of this is what less composite models less people using the base of the golden model and then having to add in all their other stuff really a good other side of the coin I think I guess what's interesting to me when I say that then though is this
35:45 isn't the net gain for the business or developers would be okay simplified access to this golden dataset I can go build a report and I think he outlines it well right like there are some high level ones and he gives some examples but a lot of this could be business need report specific right I need to go build this report it makes sense that to Tommy's point we're finance or we're commercial and we always do conversions of whatever
36:16 because we're a global company if this is just going to be a single report and all of our people across the world are going to be looking at this they should easily be able to select the conversion rate into what's most meaningful makes a ton of sense the point that sticks out to me though as we're talking about this is there is a cost here so while you're trying to avoid composite models what you're doing is you're modifying the measures of the golden
36:46 dataset like the golden model and there is an impact in that like where you start mucking around with it rather than just doing a simple sum if it's SUM plus IF blah blah blah every time that measure is used there's a processing cost that you're incurring on the golden model and I would be worried about the performance of that as you implement more and more of these this is an interesting one because I'm really
37:16 jumping fences on this one going back and forth on which way do I lean Mike or Seth this is a great point because yeah I think you would discourage composite models not necessarily turn them off but say try to avoid them but to your point you start adding this to your measures and I have my gold metric and now I have a metric that's named similarly for my what-if parameter or for this case well
37:46 who's to say and it goes back to Mike who's using this and are they using this responsibly right are they using it correctly because yes and that was my second point I was going to say earlier once I put these things in there needs to be a level of education for all of the report developers to make sure they understand here's what's in this table here are examples of how to use it and if you're going to join this stuff I'm thinking about the example here about the
38:16 year month date table thing I actually do think that one's very relevant because a lot of times I get requests for I just want the year to date I want a month to date these are just quick selections they're trying to select a range of dates but because the default slicer or the slicer mechanisms that we have to filter things are not sufficient all the time it would just be easier to click a button I clearly see it's month to date boom that's the button I want
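The year-to-date / month-to-date button idea can be sketched as a small disconnected selection table driving a SWITCH, something like the following (the names and the 'Date'[Date] column are assumptions about your model, not from the episode):

```dax
-- Disconnected list of period labels, surfaced as a slicer or buttons
Period Selection = DATATABLE ( "Period", STRING, { { "Actual" }, { "MTD" }, { "YTD" } } )

-- Route the base measure through the chosen time intelligence
Sales by Period =
SWITCH (
    SELECTEDVALUE ( 'Period Selection'[Period], "Actual" ),
    "MTD", TOTALMTD ( [Total Sales], 'Date'[Date] ),
    "YTD", TOTALYTD ( [Total Sales], 'Date'[Date] ),
    [Total Sales]
)
```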
38:47 so I think those are some enhancements that should come out of the core tool but it doesn't always come out that way so sometimes we're doing these extra things to make it easier for people to use as long as your report developers understand it and you have ample training and guard rails like you just tell them how this is working I think you're pretty safe here I'm going to go on the other side of the coin here and the benefit or the case for adding this to a gold semantic model or utilizing them at all if we were to poll everyone listening on
39:17 whether they had even added a user input table to a report in general I don't know what that number would be and I think it would honestly be relatively low I think it would be less than half there's a lot of benefits for doing this that I think can accelerate data culture and the adoption of people utilizing Power BI because a lot of times I think the argument with Power BI when they're coming from Excel or something like that is that there is no input they are now purely a consumer like
39:48 they're watching TV and they have no ability to have that conversation with their data to adjust it and I think there's really an acceleration where they actually have the ability to modify and in a sense touch their data to actually ask questions where the numbers can change in a positive way because they actually want to see like hey if I increase my open rate of getting new customers by just half a percentage look what that does for my bonus or look what that does
40:18 for my team and that becomes a conversation they can have with their team on what can we do to increase X Y and Z compared to just here's our trending here's our KPIs I would lean more towards encouraging user input tables not necessarily for every report but I think there's a lot more situations than we think because this is very much a feature that users don't know exists so they're not going to ask for it and if they don't ask for it we're not going to do it but it should
40:50 there's a lot of cases where I think they would love it I agree with that I think there's but I see things like field parameters being used in more very specific designed scenarios it's planning it's projecting it's estimating like I feel like these things are more towards I have data that I understand its historical values and I'm trying to figure out possibilities or combinations of possibilities that could happen in the future scenario planning a little bit right I think that's
41:21 where I feel like most of these examples come from the one that I think
41:24 is very real time in reports is again I go back to the date one right the dates one seems to be very consistent like you could use that in any report pretty much anywhere you want to go I don't know if that's giving consumers enough credit on what they're asking with their data Seth you have a face here that I'm really curious about I feel like there's some thoughts mucking around in there I hope there's always thoughts mucking around but I'm just listening okay so I thought there was something very formulating a trapping question
41:55 Tommy Tommy when have you ever known me not to interrupt people or share my thoughts yeah I knew all too well but well no Mike I don't think you're giving enough credit to consumers and really what they want right because again if people didn't know a bar chart existed users don't know what they want we all have our battle scars I'm scarred from this because they don't know what they want and what they ask for is like what the heck are you doing like let's get you something that's
42:25 realistic here you want a one billion row export to an Excel file or Excel table like not going to happen dude that's not realistic I completely agree but I think again that goes back to their own level of knowledge of the art of the possible things we've talked about in demos why wouldn't you add this as one of your demos in that art of the possible scenario I totally agree showcase it and really push it I would agree wholeheartedly on that one but what I'm trying to get to is in reality in practice is this a
42:57 good idea and I think in practice I'm not pushing as hard for field parameters unless I'm having direct conversations around we're doing forecasting I'm running an ad campaign and I need to see what the conversion rate is and it may be different based on some factors or I'm projecting some things so I understand where we're going I would probably push back if I look at all the DAX that's supplied here or the M or whatever they're doing all these options are like four or five items long there's
43:30 a couple there that are like 100 items long but none of these are big tables that we're applying here so to me it feels a lot more relevant to put these kinds of things in a composite model and let people build thin reports off of that composite model because we're passing just a handful of small parameters back to an original model that then modifies some of those formulas again there's always caveats with this like if my DAX gets really complex and big in that composite model it's going to run
44:00 slower but it feels reasonable to say that a lot of this could just be handed into a composite model and I could keep my main single model cleaner and more streamlined as much as possible yeah I do like the article from the standpoint that I think this opens the door for options yes that people should be aware of related to
44:32 the scenario where hey we have this golden model that we're using but I have this need and that need is either going to drive me towards building my own composite model or my own model that's a derivation of this albeit being sourced from the same thing but the only reason I'm doing that is one or two things right and that's where I could see like okay well we could do that are there any processing
45:02 costs like do we want to stand up a new model we have more technical debt if we have to change things or is this your thing it's not going to be widely disseminated it's okay to be a composite model or do we throw one of these user input tables onto the gold model and then you don't have to do anything you can just create your thin report and that's where I like it as an option but I don't think it's carte blanche I'm not going to be like yep this is the way to do it all the time I think it would be specific to the
45:33 needs of what you're trying to develop and for what audience I would be willing to guess here that adding these things so I think the vibe of the article is here are six tables that you just may want to add to every model because people may need to use them here's what they do here's how to maybe use them I don't know if I'm going to say yeah I'm buying into it I'm putting all these in every model now moving forward like I'm sold I'm going to do it however I do think the article is very good about highlighting here are very specific use
46:03 cases of where you would use these user input tables and why you might want to apply them into the golden semantic model and again I think it's important to make the distinction where we're talking about gold semantic models we're not talking about a requirement we're talking about an option right a requirement would be you have to add this to every gold model whereas these are things you may use in the future or may not and do we want to maintain that is that something we really want to keep track of in our model I would absolutely put
46:35 what we're seeing here as an option for a gold semantic model and not necessarily a requirement because man I'm thinking especially from an audience-based thing if I have semantic models that are more manager level roles they're going to want to have this and it doesn't need to be in a composite model but yeah I would completely not say this is something you have to see if this works for every gold semantic model yeah I think I'm on that same page
47:05 as well Tommy I'm pretty much in that place where these are and also I think this is a great point did people know this exists I think a lot of people don't even know this exists and they're like wow it would be nice if I could do XYZ things with this I think this is also very informative of these are patterns that you may be observing yeah are there any patterns that we missed here are there any other tables that you have made user input tables that do not exist here that should be included these are pretty good and I
47:35 think from the parameter field there's the yes no one or the parameter field rating that I found interesting I feel like there's a lot of scenarios I haven't thought of that I'm thinking of now looking at this so there's one where there's a list containing rating values low medium low medium medium high and high and they can actually select the rating so rather than having to your point a scale of zero to 100 which can always be very ambiguous yep where everything is assigned basically some
48:06 type of number where you'd say if we do great well what does great mean or good if you have that ordinal value rather than just a simple numeric one giving users that feature can be really impactful especially again when you're dealing more with a manager level role someone who actually has the ability to change dollars that's where this becomes really impactful yeah I would agree with that one
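The ordinal rating list can be modeled as a small two-column table so the labels sort correctly rather than alphabetically. A sketch with illustrative names (the sort order is applied with Sort by Column in the model):

```dax
-- Rating labels with an explicit sort key, since alphabetical order would scramble them
Rating = DATATABLE (
    "Rating", STRING,
    "Sort", INTEGER,
    {
        { "Low", 1 },
        { "Medium Low", 2 },
        { "Medium", 3 },
        { "Medium High", 4 },
        { "High", 5 }
    }
)

-- Optional: translate the chosen label back into a number for calculations
Selected Rating Score = SELECTEDVALUE ( Rating[Sort], 3 )
```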
48:38 and I would say to me it's more of like and this is probably not part of this blog post or maybe it is I'm thinking more of the thin report side right the only other one I'm thinking that I would feel comfortable adding in here would be things like thin report measures that are alluding to graphical shapes graphical items or maybe some SVG type things that I would potentially put in here that are to your point Tommy like I need some data to say this is the red arrow pointing down this
49:08 is the green arrow pointing up there are some of those out of the box shapes but you can add those shapes to your theme file and then you can call out those shapes using measures so there's potentially an opportunity here to add some of those things as well so that's the only thing I could think of that was missing from here potentially there are some user inputted shapes or company shapes that we want to use maybe there are specific things you're using in your company to show some things
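The arrow idea mentioned here is typically done with a measure that returns an SVG data URI and has its data category set to Image URL; a rough sketch ([Sales Growth] is a placeholder measure, and the shapes are illustrative):

```dax
-- Returns a green up-arrow or red down-arrow as an inline SVG
-- (set the measure's data category to Image URL so table visuals render it)
Trend Arrow =
VAR IsUp = [Sales Growth] >= 0
VAR Shape =
    IF (
        IsUp,
        "<path d='M10 2 L18 18 L2 18 Z' fill='green'/>",
        "<path d='M10 18 L18 2 L2 2 Z' fill='red'/>"
    )
RETURN
    "data:image/svg+xml;utf8," &
    "<svg xmlns='http://www.w3.org/2000/svg' width='20' height='20'>" &
    Shape & "</svg>"
```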
49:38 I’m thinking of it’s and again it’s more complicated is like there’s now spark lines on things used to be that it had to be like those had to be measure driven conditional formatting yeah but that but now that’s that’s being so what has happened is you don’t need those as much now CU it’s now being built into more of like the actual tool you can you can add them to tables now as needed so it’s change the features have changed slightly but it’s some of those little like example SL Improvement of Life pieces that I would add from a visual standpoint I maybe add in here as
50:08 standpoint I maybe add in here as well but again that would be more company specific and they’re probably not as common as these are I think these are probably the more common ones all right let’s jump into our new segment Tommy’s got a hot take so yeah I’ve I think this might be a nice little segment and what we’ll do I think moving forward from episodes if you guys want to I think if there’s a need for it is I’m G to give a hot take hot take Tommy thing very Tommy hot take a very yeah Tommy’s hot take a it’s a very New York thing to do to give someone a nickname for example there’s a pizza place I go to now the guy’s from New
50:39 place I go to now the guy’s from New York and he’s like three pies Tommy kind York and he’s like three pies Tommy thing Tommy three pies yeah so you of thing Tommy three pies yeah so you get three pies every time you go oh I have to yeah it’s delicious thing pie life in in in Lumbard or GL so shout shout out yes so and it’s gonna be somewhat related to the obviously the episode of the main topic and I want to get your guys’s gauge your your heat check on one to seven seven you strongly agree one strongly disagree okay and this going to
51:10 be general user input tables I'm not going to say necessarily golden semantic models however creating and deploying a user input table once in your Power BI life is required to be a true Power BI professional you must have at least one report built and shared if you want to say you're really a Power BI professional is this an essential skill this is good I think this is a good one I'm struggling on
51:41 this one and to reframe your question how I'm interpreting it is
51:45 this an essential skill to know how to do in order to become what we would quantify as a pro we're setting bars here we're setting bars exactly yeah all right well you've got a one through seven scale Tommy of how much I agree with this statement I'm probably going to so I'll give you my perspective and the lens I look at the world through there are different personas of people and if you fit a certain persona there's a
52:15 certain skill level that goes along with it right so to quantify my answer this to me is a data modeler skill this is the persona of a person who is a data modeler because we're talking about implementing it into the model which then has to be utilized in the report so I'm going to quantify this as a data modeler skill set and then inside that skill set I'm going to say this is probably like a 200 or 300 level this is not basic building simple sums and building some
52:45 general aggregations this is understanding that you can have a parameters list create it and then apply it correctly in the model so that you can influence it and there's a couple advanced concepts in this article talking about we're going to build a table with an inactive relationship I don't think that's level 100 I think that's level 200 or 300 so is this a professional skill I'm going to probably agree with you and say this is probably like a five for me this is a skill where if you're going to call yourself a professional you should
53:15 have some experience or knowledge of using field parameters do I use this all the time no I do not so is this a core skill that I would very much make sure I know how to do probably not so I'll give it like a five out of seven that's still pretty strong I think you need to know how to do it what is the hot take question one to seven no what is the hot take question creating and deploying a user input table at least once in your career is
53:45 required to call yourself a Power BI pro one one wow Seth wants to tune but no here's why in your hot takes you're asking a specific question which is you're saying do I agree that in order to call yourself a Power BI expert you have to have created and deployed one of these no picking a technical skill out of the blue and tying expert level to it as far as implementation isn't valid
54:18 or relevant I think in terms of discerning whether or not somebody's an expert in Power BI and why is that it's exactly what Mike was talking about your level of implementing things is typically tied to business needs and like the scenarios that you've had to deal with now would I say an expert should know about this absolutely
54:48 because it's part of the base knowledge lexicon of how I would go about delivering a solution if and when there was a need for me to implement something like this so if there was never a need and I know about it I'm still an expert around it I know it's an option for me to implement but I'm not going to say there's a stipulation that you have had to have created and deployed it so no one even if I change the wow
55:18 that's well you can't change the answer or the question it's a hot take question you're asking me to disagree with your specific question and that's why it's a hot take maybe in the future Tommy this will just be me words are important I get the feeling with this segment Seth is either going to be ones or sevens there's gonna be nothing in between but again they're a hot take for a reason it's not my job to interpret what Mike
55:49 is doing in your question right which is probably more valid me I'm just hard yes or no man you want to get to know us listen to our hot takes Mike when he just explained his reason it was such an engineering answer well I have to do this and therefore that's a 200 level and that's a role I'm giving you some context where the answer is coming from it's a dumb question and that's a one a dumb question one I didn't say dumb I said there was a specific question and I'm giving you my answer based on
56:20 your specific question that's fine see interestingly enough I'm gonna go with a four with a three here a 4 3 okay 3.5 I meant to say three I meant to say two no I meant to say one when I initially wrote this I was gonna go with a seven but as time simmered down I simmered a little this is a three for me I would really like to see this from a pro because it's not just the ability to do it it's the situations they've been in and that's how I took
56:50 they’ve been in and that’s how I took this and if and if you’re if you’re the role of a like Al thr other things here too like I gave you the five because it was a five for someone who’s a data modeler I’m going to give you like a lower number for a report developer I’m going to give you a lower number for a report consumer I’m going to give you a lower number for an admin of powerbi like there’s other roles here that I’m like yeah it probably doesn’t really make sense so funny because I I wouldn’t see this from the data modeler I would want this from someone who’s a report author because that at least tells me they’re having good conversations with stakeholders right because going
57:21 stakeholders right because going building tables they’re just trying to get the pat reports as quickly as can but going back to our conversation if user doesn’t know about this they’re not going to ask for this so this is at least telling me that a report author or someone whose client or stakeholder facing they’re having great conversations they’re having interactive conversations so this came up but is it required at a report of be considered a pro it’s less are not just technical though they’re not really good at data flows yeah it’s I’m less likely to give it a higher number at a at a at a report
57:52 it a higher number at a at a at a report Builder level so intriguing well I think I like hot takes now I think this is all right let’s just create as much conflict on the podcast as we possibly can with Tommy’s random questions and Seth just being angry with the question because it’s not right this is great this is gonna be a wonderful segment that’s not anger that’s not anger okay that’s not anger just I I drive my point of view that’s right correct yeah we know all too well words matter words matter that’s what I love it I love it hey my final Thought Justin just started
58:23 blogging in January of this year keep it up great job thank you for the content that you put out and it was a great driver for a good conversation on the show today very much so awesome with that any final thoughts you want to wrap up here with no that was my final thought okay great nice job good job Seth which is why which is why words matter I prefaced it my final thought okay sorry I must have missed that little word I was reading John ky’s comment I know what you were doing I was reading his comment trying to figure out this I need to say something about this
58:54 one so no I think for someone who’s and I’ll go back to that someone who’s listening to this going should I try this out this should absolutely be something that you at least if you have a good stakeholder if you have a good relationship with one of your departments absolutely begin to introduce this and see if there’s some fruit there so man what are we doing here the chat is just on fire today the chat is on fire today chat is on fire thanks John I appreciate your comment Seth has spoken I I feel like there’s another
59:24 like Star Wars reference that’s coming in here like am I on the good side or the bad side people start calling me Emperor Palpatine or something no there’s a in the Mandalorian there’s a small guy that says I have spoken like he says his thing and then at the end of it is I have spoken and like that’s the finality of like I’ve said it no more conversation on it we’re done I’ve said the thing so it sounds like Seth is saying Seth has spoken it’s not it’s not you asked for my tone
59:55 Seth it’s all tone man it’s not the words words matter but so does tone yeah I don’t have any final thoughts on this one really other than the fact that this was a good article I like the topic here I think these are relevant but they’re not going to be used everywhere so use with caution that being said thank you very much for giving us yes this is the way this is the way exactly right Power BI is the way that being said we’re going to wrap this episode thank you very much for your ears for the last hour we really appreciate it I hope you had a lot of fun and got a couple laughs out of
60:26 one potentially or just got really angry at field parameters that could be also what the output of the conversation was so if you enjoyed the anger that we’ve induced on your daily Power BI building experience please share with somebody else give someone else some frustration with these Power BI field parameters and let them think through this as well and how they want to apply these into their models with that Tommy where else can you find the podcast hey guys you can find us at Apple Spotify or wherever you get your podcast make sure to subscribe and leave a rating it helps us out a ton do you have a hot
60:57 take that you want us to talk about we can make an hour out of that no problem sure we’ll talk about nothing for anything well we’re doing nothing right now for free Power BI yeah this is all free this is amazing guys we don’t charge a dime powerbi.tips podcast we got a form there go ahead and leave a question or hot take and leave your name because we also want to shout you out we want to make sure we call you out and say how bad your idea was that’s not true yeah no we’ll all do it join us live every Tuesday and Thursday
61:28 7:30 a.m. Central and join the conversation on all of the powerbi.tips social media channels we really need an audio clip of Seth has spoken I have spoken I like the Execute Order I like it it’s good thank you Seth I appreciate all the value that you add this is good Seth wit coming out it has been such a good conversation today guys thank you excellent thank you all so much and we’ll see you next time
62:06 you [Music]
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
