The Product Experience podcast | Mind the Product | July 07 2021

Getting the max from an MVP – Dan Olsen



Want to make a veteran product person sad? Just ask them how they’ve used MVPs in the past.  Once you get past the sobbing, you’re likely to get stories of the myriad ways that a perfectly good experiment got turned into a v1 release.

It doesn’t have to be like that!

Dan Olsen joins us on the podcast to teach Lily and Randy how to actually control the scope of an MVP, guide us through some of his favourites, and explain why MVP failure can be considered a success.

Featured Links: Follow Dan on LinkedIn, Twitter, and his website | Read Dan’s The Lean Product Playbook | Types of MVPs | Lean Product Meetup | The original MVP article | How to successfully validate your idea with a Landing Page MVP, and the follow-up four days later | A Landing Page is NOT an MVP | Aaron Walter’s book Designing for Emotion

Episode transcript

Lily Smith
This week on The Product Experience podcast, we’ve invited back our good friend, Dan Olsen.

Randy Silver
Oh, cool. I love chatting to Dan. He’s the author of The Lean Product Playbook and a product trainer, and he’s just done a tonne of stuff in product over the years. He really knows his stuff.

Lily Smith
He really does. And last time, we covered the kind of tricky concept of product market fit. So this time, we just had to get the king of lean startups back in to cover the notorious MVP.

Randy Silver
Yeah, I see what you did there. Really, the MVP sure is notorious. But if you want all your MVP myths actually debunked, then just listen on.

Lily Smith
The product experience is brought to you by mind the product.

Randy Silver
Every week, we talk to the best product people from around the globe about how we can improve our practice, and build

Lily Smith
products that people love. Head to mindtheproduct.com to catch up on past episodes, and to discover an extensive library of great content and videos.

Randy Silver
Browse for free, or become a Mind the Product member to unlock premium articles, unseen videos, AMAs, roundtables, discounts to our conferences around the world, and training opportunities.

Lily Smith
Mind the Product also offers free ProductTank meetups in more than 200 cities, and there’s probably one near you. Hi, Dan, welcome back to The Product Experience podcast. So lovely to have you back in the studio.

Dan Olsen
Yeah, totally, man. It’s great to be back. I don’t remember exactly how long ago it was, but it was fun back then, so I’m really excited to be back talking with you all again today.

Lily Smith
It’s definitely been a while, and probably too long, but thank you so much for coming back to join us today. For anyone who hasn’t yet heard of you (and I can’t believe there will be anyone listening to this who hasn’t come across some of your content), do you want to give us a real quick intro into who you are and your background in product?

Dan Olsen
Sure, sure, happy to. My name is Dan Olsen, and I’ve been working in product management most of my career. These days, and for quite a while now, I’ve been a product management trainer. I’m largely teaching private training workshops at companies. I also do public workshops and speak at conferences like Mind the Product and other product conferences. And I’ve also done product management consulting in the past. I started my career at Intuit as a product manager, which was a great place to learn product management back in the day, and then worked at startups before getting into doing my own thing as a trainer and consultant. Two other things I should mention: I’m the author of The Lean Product Playbook, which came out a little over six years ago now, and which is kind of a guide to achieving product market fit. A funny fact is that my editor reached out to me because his original thesis was to write a book on MVPs. And I was like, well, MVPs are okay, but why don’t we zoom out a little bit and just talk about, you know, how to build successful products, and we’ll cover MVPs in there. When they do a book there’s always a subtitle, a second title, and in the subtitle we retained the MVP: it’s “How to Innovate with Minimum Viable Products and Rapid Customer Feedback”. But of course, nobody refers to the subtitle. And then also, for a little over seven years now, I’ve built a product community called the Lean Product Meetup here in Silicon Valley, where each month we bring in a top speaker, and a lot of the MTP conference speakers have spoken at my meetup. We’re over 10,000 people now, and with COVID we took it online, so it’s actually been awesome: we now have a global audience. So I’m a big believer in sharing best practices and building community. That’s the kind of thing that keeps me busy these days.
Awesome. Thanks, Dan.

Lily Smith
And on your point about MVPs: our topic today is all about MVPs. So before we get stuck into the nitty gritty, let’s talk about why the MVP is such a hotly debated topic. People get so obsessed about whether it’s a good thing, whether it’s this or that or the other, and they seem to be very passionately on one side or another, with very strong opinions about it. So what do you think it is about the MVP that has stirred up so much emotion in the product community?

Dan Olsen
Yeah, I mean, we have a lot of acronyms and concepts in the product management world, but I can’t think of one that is more divisive than MVP: within companies, across companies, between PMs. People just have different understandings and conceptions of it, and often very, very strong opinions. For me, part of the evidence is that you see different acronyms. You’ve got MVP, minimum viable product; then people didn’t like the fact that that sounds kind of like you’re cutting corners, like it’s going to be a little shoddy, so: let’s do the minimum lovable product, right? And there are people making a minimum sellable product, a minimum viable feature, and so on. Anytime you see people reinterpreting and creating their own acronyms, there’s a difference of opinion and something’s going on there. And I remember, back in the day, MVP got popular with the Lean Startup movement, when Eric Ries’s book became much more popular. I remember shortly thereafter, someone wrote a blog post along the lines of “here are the lessons we learned using a landing page MVP to test our hypotheses”. It was a perfectly fine article. And somebody else got really upset that this person could possibly think a landing page could be an MVP, and they wrote a counter post; the title was just “A landing page is NOT an MVP”. That was the title; they were so upset. I have screenshots of those, and when I teach my workshops, I put them up and I’ll just do a poll with my audience. I’ll be like, okay, yes or no, let’s settle this once and for all: is a landing page an MVP or not? And I usually get about 50%, plus or minus 10%, picking a side, if you will.
And it’s just so funny, because the way that goes down is, the hardcore people are like, of course it’s not an MVP, what are you talking about? Don’t you know what the P stands for? It stands for product. A landing page is not a product; a landing page just sells you something. So no way: MVP means product, a landing page is not a product, it doesn’t compute. And then the more laid-back people are like, well, hey, anything you learn from is cool, it’s an MVP, why have you got to be so judgy, man? So there’s a very strict interpretation and there’s a very loose interpretation, and I don’t think you can ever get those two parties to agree; they’re never going to agree. So what I propose in my book, and what I do, is: instead of arguing black versus white, let’s elevate, go up one level, and call them all MVP tests, or MVP experiments. Because that gives the hardcore people an option: you can reserve MVP for the hardcore stuff that you consider truly product stuff. But can we at least agree that a landing page could be an MVP test or experiment? And they’ll begrudgingly say, yeah, I guess so. So that’s the way out of it; otherwise there’s no real way out. The fundamental difference is this: “it has to be live product” is the hardcore view, the most restrictive view. A somewhat softer version is: is it a representation of a real product, which a landing page still is not, but which a prototype is? And the most liberal, easy interpretation is anything where you’re testing a hypothesis. So that’s the spectrum of interpretations that we see.

Randy Silver
I think you’ve just given us yet another acronym: the MVP test. The acronym factory is cranking them out. I love it.

Lily Smith
So, just to take it down a notch into some of the detail: for those who haven’t heard the term, how would you describe a minimum viable product to anyone new to the concept?

Dan Olsen
Yeah, I mean, sometimes people will say, what does MVP mean to you, and they’ll just say, minimum viable product, and I’ll go, but what does it mean to you? The general idea is: the smallest amount of functionality that creates enough value for a customer. That’s the generally agreed-upon idea: you don’t want to build anything more than you have to. You could err on the side of building more than you have to, which would exceed the minimum viable product. You could also err on the side of being too skimpy, and then it’s not viable. And the customer gets to say what’s viable, right? So you say, hey, we think if we just launch this product with these five features, that’s going to be viable, and you launch it and they go, no, you’re missing this key feature that I need; then that would not be viable. So that’s the general idea. And I think one important concept is that it’s reduced scope. It’s kind of like the anti-waterfall mentality. Instead of being like, well, we’re gonna launch the Cadillac version or the Mercedes-Benz version, it’s: what would the Honda Civic version look like, right? And one of the popular frameworks floating around out there, I think it’s from Spotify, shows how not to build an MVP using the example of building a car: first you build a tire, then the frame with four tires, then the body. The point of that framework is, hey, it’s not just about building things incrementally, because we’ve got plenty of companies supposedly doing agile that are just doing waterfall in two-week chunks. It’s that you want to create some intermediate milestone, instead of one big deliverable like waterfall, and that intermediate milestone has to actually create customer value. So that’s part of it too: it should be reduced in scope.
That’s the aspect of the MVP everyone kind of agrees on. I really like MVPs that are also what I would call a lower-fidelity representation of what you plan to build. So not only are you saying, okay, instead of doing the whole scope of all 10 features, we’re gonna see if we can get by with an MVP of these four features; then when you do the four features, you don’t say, well, step one is build the four features and see what people think. For me, step one is: let’s build a lower-fidelity prototype of what an MVP with those features would look like and how it would work, and let’s test that. So there are two distinct dimensions of an MVP being reduced, in some sense, from the full product: you’re reducing the scope, and ideally you’re also reducing the fidelity with a prototype, a prototypical representation. Now, even if you’re not doing the second, there’s still value in the MVP thinking of reducing your scope. And one of the things I see when people err on the side of building is people thinking, oh yeah, I get how, for a blue-sky, greenfield v1 product, we’ve got to figure out the MVP. For some people, that’s the only scenario that really resonates. But the reality is, it’s a mindset. Whether you’re building version 2 or version 1.1, you still have to answer the question: what’s just enough functionality to create enough customer value, and what would be too much? Because the general idea is, sure, we can build more, but then we’re going to delay the launch, we’re going to delay the revenue, we’re going to delay the learning, and all that jazz. So honestly, it applies every sprint. Every sprint, good product teams ask: what really needs to be in this sprint, and what can wait? That’s really the fundamental essence of the question: what really, really needs to be here?
And the challenge is that most people err on the side of, yeah, let’s throw in the extra feature. Subconsciously, it’s like, the more features we have, the higher the odds there’s something the customer likes, and the higher the odds of success. And it’s kind of a cop-out, instead of being like, no, we really know our customers, we really know their needs, we know we really need these features to meet those needs, and these other feature ideas that sound interesting aren’t critical. So it ends up being about saying no, which is also my favourite definition of strategy: saying no. And if no seems a little too harsh, it just means not now; not for the milestone, not for the product release milestone that we’re talking about; it’s sometime after that.

Randy Silver
So it’s easy enough to look at this and say, right, I can totally get this for the back end: there’s that whole thing of build something that doesn’t scale, prove that it works, prove that you need scale, and then scale it. So you can skimp on back ends; you can put in manual processes that you can automate later, things like that. But for the front end, for the feature itself, how do you know what’s just enough? How do you know that if you make it too low fidelity, people are just going to respond by saying, I don’t like it, I don’t need it, because it’s not nice enough, it’s not good enough, even if it fundamentally might solve the problem from a functional perspective?

Dan Olsen
Yeah, well, the thing about “good enough”: that’s the feeling people have, that it’s not good enough, and “good” can be interpreted different ways. One way people worry it might not be good enough is: gosh, it doesn’t have enough features, doesn’t have enough functionality. That’s what I was just talking about, and the tendency there is to just put additional features in. But even with what you were saying about doing something that doesn’t scale on the back end, or Wizard of Oz, or something like that, I still believe the first thing you should do is a prototypical representation of it. Don’t even build anything; building is the most expensive thing you can do. Even if it’s, hey, we’re building it in a cheap, non-scalable way, I would still always use a lower-fidelity representation first. Now, theoretically, there’s a risk that your low-fidelity representation is so crude, like, show a napkin sketch to the CMO of a Fortune 100 company and they’re gonna be like, what are you showing me here? But you’d be surprised: you don’t really need that high a fidelity, typically. I’d say there are two main prototype fidelity points. There’s low fidelity, where you would use a tool like Balsamiq or something, and that would probably be grayscale; you wouldn’t worry about the fonts and the colours and the pixel perfection, but it would give a sense of what features are in the product, what’s the general layout, the general navigation, the general information architecture, and some basic interaction design. That’s the low-fidelity point.
And then the higher-fidelity point: you can use a tool like InVision, which is awesome because it can take the image output of the tools designers use, like Sketch, Illustrator, Photoshop, and then you, as a non-designer, non-technical person, can take those images, upload them, and create little hotspots, and say, okay, when somebody clicks on this rectangle that I’m overlaying over this button, it should go to the next screen. I’ll tell you, that InVision level is where I do most of my product validation when I help clients; you can get so much done if you do it really well there. The easiest approach is to always do lo-fi first and then do hi-fi testing. With lo-fi, what you get is the ability to settle debates about missing functionality and major issues with your UX, basically. So the biggest thing is, say a team is arguing about whether we really need this feature in the MVP, and the team is split on it. A great way to test that would be: let’s build the MVP as a low-fidelity prototype without that feature in it, show it to 10 customers, and we’ll see which half of the team is right. If all the customers complain, the team will be the first to say, hey, you’re right, we need to put feature X in. But maybe, just maybe, it’s just you worrying your MVP isn’t good enough, and it’s not based on data or evidence. So we go and talk to customers, and they don’t say anything, because customers don’t usually complain if you have extra stuff in your prototype. But if you’re missing something critical, they will definitely tell you, and they’ll complain.

Lily Smith
So at that point, what would you say is the ultimate purpose of the MVP? Like, when is the best time to use it?

Dan Olsen
The ultimate purpose of it is basically to test your hypotheses. That’s the whole goal, right? The main framework from my book is the Product-Market Fit Pyramid; it categorises the five hypotheses: who’s our target customer? What are their underserved needs? What is our value proposition, which is to say, which needs are we going to target, and how are we going to meet those needs in a way that’s better than the competition? What should our feature set be? And what should our user experience design be? The tip of that pyramid is the UX; you’re working your way up to that UX, and when you have a prototype, it encapsulates and embodies all the assumptions and hypotheses you made along the way. So you’re really testing everything. And I would argue the ideal time is before you do any building or coding. If option A is that you test a prototype before you code and build, you’re gonna have time to change it. If option B is, well, we don’t have time to do that, let’s just code the alpha and see what people think, that’s gonna take longer. And then you’re also anchored in solution space. It’s like, we’ve already built it; gosh, the API calls are designed to do it this way; I know the user feedback was that they want to do it this other way, but sorry, we already picked our JavaScript library and we already set up the databases. The second you start building, the tech debt starts accruing and you’re building in constraints. In the example I’ll talk about later, we moved so quickly because it was all in prototypes. So back to your question: it’s there to validate or test some major assumption that you have, like, do we need feature X in the MVP or not? You would do a prototype and test it, basically.

We’ve identified this value proposition, and we think that if we cobble together this set of functionality, customers are going to think it’s superior to the existing products out there. That’s when you would use an MVP: to see if that all rings true or not when you actually test it with people.

Lily Smith
So, you must have worked with lots of different businesses on MVPs. I’m really curious to know: what are your favourite examples of MVPs that have proven something to be right, but also ones that have proven something to be wrong?

Dan Olsen
Yeah. And I will just say at the outset that those are extremes. Right and wrong, 100% right, 100% wrong: you rarely get those kinds of outcomes. It’s all about iterating: can we get more right, or is this just a bad idea? I can give some specific examples of mine, but I think it’d be good first to talk at a high level about the different types of MVPs, going back to that thing about the hardcore product people versus people being looser in their interpretation. So the hardcore product MVPs are the things I’ve been discussing: wireframes, mockups, interactive prototypes. They’re meant to be a representation of your product. A landing page is a sales pitch for your product; it’s not actually meant to be a representation of your product. And that’s where two other ones come in. The Wizard of Oz we alluded to a little bit: you’re doing something that isn’t the real technical solution at the end of the day, but it’s behind the curtain, like the Wizard of Oz, so the customer isn’t aware of what’s going on. The famous example there is Zappos, right? When Tony went to sell shoes, he didn’t go buy 1,000 shoes, stock a warehouse, and then say, okay, now I’m ready to sell them. His biggest risk was: why are people gonna buy shoes from a website without being able to try them on? Is this even gonna fly? So he got the website up with pictures of shoes, and you could order, and then at the end of the day he’d go down to the local shoe store, buy the shoes, and mail them. That’s called Wizard of Oz: your customers have no idea what’s going on. Related to that is the concierge MVP; the similarity is you’re still doing some manual process that is not going to scale.
The difference there is that you’re telling them; you’re not hiding it from your customers. It’s like, hey, we’re doing a high-touch service with a small group of people; we’re gonna work with you very closely on this manual process to get it down before we automate it, things like that. And then, last but not least, your live product: if you launch an alpha or a beta, everyone clearly agrees that’s an MVP. But hopefully I’ve made a good case for why, before you go to the live product MVP, you should at least use some of these other ones. The one that I love the most is an interactive prototype, as I said. So that’s the hardcore product stuff. And I like to divide these into qualitative and quantitative; the ones I’ve mentioned are all pretty much qualitative, because you’re just going to be showing them to people. On the quantitative side, for product stuff, there’s the fake door, or the 404 page: before you go build the feature, you see if anyone’s even interested in it. I would call this demand validation. The popular example here is Zynga. When they were thinking about an idea for a new game, before they would go too far down that path, they would put an ad for that game in their existing games. And when you click on it, instead of the game being there, the elegant version says: hey, thanks for expressing interest, we’ll put you on the list when it comes out. It’s called a 404 page because the cruder way to do it is that the click just goes to a 404, and you track the clicks. The idea is you’re looking at what percentage of people actually express interest by clicking on this thing.
And then obviously product analytics and A/B testing are a good way to test new ideas too. If you’re considering a new UX design or a new feature, you can use an A/B test and see what’s the true impact or usage. So that’s the hardcore product stuff. And then I love all the other non-product stuff, which I would just put under a marketing umbrella, because you’re really just testing how we talk about the product: does it resonate with people? Which is valuable, don’t get me wrong. Once you’ve gotten clear on your value prop and how you want to message it, by all means test it; that’s a lot easier. On the qualitative side, it’s basically any marketing materials you show to someone to see what they think. And on the quantitative side, that’s where you can do a landing page test: you put your best foot forward in a landing page about what you’re planning to build, well before you build it, and you see what kind of conversion you get. You make an ask for an email, or an ask for a credit card, and you see what percentage of people do it. One of the other famous examples is Dropbox. Drew did what’s called an explainer video, and the reason he did that is because it was too complicated to explain just with text and photos on a landing page, because it has to do with syncing across your desktop and the web. So that was an explainer video: you explain what you’re doing in a video because it’s more complicated. And then another one is crowdfunding sites like Kickstarter. If you think about it, they flip the risk on its head: instead of building the product and then trying to sell it, you try to sell it first, and if you don’t get backed, you’re not even going to bother building it. Especially for hardware, that’s a good one. Anyway, that’s the landscape, I would say, of the different MVPs.

Lily Smith
Are you struggling to find the answers to your product questions, keen to learn from others in the community, and wanting to know where to go next in your career? Mind the Product can help.

Randy Silver
Mind the Product membership will help you to level up your career, build better products, and lead successful product teams. And as the world’s largest professional network for product people, with decades of product management experience, you won’t find this anywhere else.

Lily Smith
As a member, you’ll get exclusive access to premium editorial, product experts, and product peers tackling similar challenges. Plus brand new self-paced online training modules that cover core product skills, like goal alignment, prioritisation, hypotheses and testing, and more.

Randy Silver
For more info and to become a member today, visit mindtheproduct.com/join.

Yeah, crowdfunding is a really good one. I want to take it back to the Wizard of Oz and concierge, because there’s a mistake that I think people make, and I’m curious to get your take on it. As you build the InVision prototype, you get people trying it, but it’s a wireframe, it’s a walkthrough; there’s no actual functionality there. They’re not getting value; they’re not having a problem actually solved. So what are they testing? Is it usability? Is it propensity to pay? Or are they actually validating that they would use this?

Dan Olsen
Yeah, and that’s an interesting thing that I get into in my workshops, and I learned it the hard way. When you test a prototype, you’re going to get a mix of usability and UX feedback, and what I would call true product market fit feedback. Now, here’s the thing: if you have a horrible UX, it’s going to get in the way of getting true product market fit feedback. If people can’t even click around your prototype, they’re not going to go, this is awesome. As I like to say, it doesn’t matter if you have the world’s best functionality; we could have the world’s best search engine, but guess what, if nobody can figure out how to use it, you don’t get credit, you don’t get customers. And there’s a great framework that I like to use with MVPs; on a podcast I can’t show an image, but it comes from Aaron Walter, who was the head of UX design at MailChimp. I remember when I first used MailChimp, it was head and shoulders better UX than all the other email campaign programmes out there. It was just so easy to use, and it had a funny, playful tone. You’d log in and it’s like, hey, you’re looking good today, Dan, all kinds of funny stuff. Anyway, he wrote a book called Designing for Emotion, and I’m not surprised he wrote it, given what I saw with MailChimp. Then a designer named Jesse adapted his framework, and then I adapted that; I just want to give those guys props, because the original idea was theirs and I just adapted it. The idea is you’ve got a pyramid, a hierarchy of four things. The bottom of the pyramid is functionality: how functional is the product, what does it actually do for you? The next layer up is reliability: does it work as expected, is it buggy, what’s the quality level? The next level up is usability: can I actually use it, how easy is it to use?
And then the final tip of the pyramid is delight. So it goes delight, usability, reliability, functionality, top to bottom. Now, one of my clients got tired of writing those out, so they put DURF, and we’ve named it the DURF Pyramid, basically. The whole reason for that pyramid was, you know, about 15-plus years ago, everybody got the memo: hey, usability matters. For a while people were like, yeah, whatever, they have to use it, it’s B2B software. But now everybody knows; not everyone’s building great UXs yet, but they can’t say they don’t know that they should at this point. So the whole point was to elevate the discussion. If usability answers the question, can they use the product, then delight answers the question, how do they feel when they use your product? Do they want to use your product? That was the point, to elevate that. And so, back to your point: if you’ve got bad usability, it’s going to prevent you from getting a true read on the product market fit of the functionality you’ve built. And I had this happen firsthand when I had my startup. We were launching at TechCrunch, and it had to be a private beta because we had a launch at the event. So I’d bump into someone at a coffee shop in Palo Alto, and they’d be like, hey, what do you do? And I’m like, well, I’m doing a startup. Oh, cool, can I come check it out? And I would always say, yeah, you can check it out, but the price of entry to our private beta is you’ve got to let us watch you go through the new user flow and sign up. And I tried to do the best job I could on the features and the UX, but in the first 15 or 20 tests, we ran into some usability issues. People were like, I don’t understand what this term means; I need some examples. We found a few bugs, and we just kept improving the new user flow. And we had a small team.
So we iterated very quickly, and after about test 20 or 25, those issues went away; we weren’t hearing them anymore. So then I got a little confident: cool, these tests are going pretty well. At the end of each test I’d ask people, ‘Would you use this product?’ And to my surprise, even when a session went great, with no major complaints or concerns, about 15% of people were like, ‘No, I’d never use your product.’ And I was just like, ‘What? What do you mean? You didn’t complain about anything during the test.’ What it was, it turns out, is that different people like to get news in different ways; there’s almost a philosophical approach to how you like to get your news. And it turns out that, unbeknownst to my co-founder and me, we had designed it for the way we like to get news, which, lucky for us, was roughly 50 to 60% of the market based on our rough research. After we learned about this, we found another 20 to 30% of people like to get news a different way, but luckily our design worked for them too. For that last 15% of people, though, there was just no way our design was going to work. So that illustrated to me, Randy and Lily, that there were no usability issues left. After 20 or 25 tests we’d eliminated them all; people were getting through the flow, getting to the main product, using it and understanding it fully, and then not wanting to use it. And for me, that’s the difference between usability feedback and product-market fit feedback. So yes, bad UX can get in the way of testing that, and you need to iterate your way past it to get there.

Lily Smith
We’ll come back to all of that, but I just want to go back to a question I mentioned earlier: what are some good examples of MVPs really shining a light on the direction to go in?

Dan Olsen
Yeah, thanks. I just wanted to provide that context on the different types. The one I talk about the most in the book and the workshop is a project I worked on called MarketingReport.com. And this one is great, because the startup CEO had an existing product, but he had such a small dev team that he said, ‘We can’t code anything, we’ve got to use prototypes.’ I’m like, ‘Awesome, we’re going to use prototypes.’ We did the whole process, and honestly, the idea for the product was the CEO’s pet idea: a kind of marketing report. Why do you get the junk mail, the marketing mail that you get? And it was going to have a marketing score, like your credit score. He came from the credit industry, so he was making the jump: can we do for marketing something like what happened in the credit industry? So we went down the steps, we built a vision prototype, and we tested it. And there were tonnes of questions and concerns; people were not excited about it. We had tested a few different features besides the core thing; luckily, we tested two secondary ideas, and those secondary ideas kind of resonated. I mean, the test did not go well overall. There was so much confusion, and people were like, ‘No, I’m not going to go for this.’ The main idea the CEO had did not resonate at all; it was kind of unsalvageable, if you will. But on the secondary ideas, even though there were a lot of questions and concerns, we felt like, ‘Hey, we have a good handle on this. If we pivot, focus only on this one benefit, and apply all the feedback we have, we think there’s an opportunity with this customer base.’ So that’s what we did.
So with one pivot, one iteration, we completely changed the concept and refocused it to a much tighter scope. The first MVP had maybe five different benefits or features; this one just had one or two, went really deep, and applied all the learning we got from the research. And the second time around it was like night and day. There were still some questions and concerns, but they were much smaller. Both times we asked people, ‘Hey, would you pay 10 bucks a month for this service?’ Now, when you ask people if they would pay, they don’t actually have to pay you, so take it with a grain of salt, but it was a night-and-day difference. And the funny thing was, the way these sessions work is you say, ‘Hey, Randy, come spend 45 minutes with us, we’ll give you an Amazon gift card, or we’ll give you a check for 100 bucks or something.’ At the end of a test, people usually take their check and they’re done; they take their check and they run. But at the end of this test, everyone stayed, asking, ‘So is this live now? Can I go use it?’ So it was really exciting to see that, with careful attention to what we learned from customers, and pivoting and iterating, which again we did in about four weeks because we were working with prototypes. We had no qualms throwing the original prototype out the window and starting with a fresh slate. If that had been live code, I think we’d have had a little more sunk cost fallacy going on.

Lily Smith
Do you often take different variants of prototypes to customers? Because when you’re early stage with an idea or concept, there are often a few different ways you could approach it.

Dan Olsen
Yeah, that’s a great question. It’d be easy to say, ‘Yeah, test everything’, but it takes time, so you don’t want to kill yourself with too many options. The main reason we did that, which I kind of skipped over, is that the original feature set the team had collectively brainstormed had like eight or nine features in it. And when it came time to define the MVP, I didn’t want to put in all eight or nine. That’s a kitchen-sink MVP, trying to throw as much against the wall as possible. So what I said was, ‘Let’s take the core idea of this marketing score and combine it with this logical group of two other features.’ That’s one MVP; it’s all about feature set definition, that’s the scope for that MVP. And the other one was the core idea plus the other ideas. All of it in a single MVP would have been way too much; it would have been all over the place. But to your point, you have to decide how far down the pyramid you’re going to go with variants. One easy thing to do, like we just did, is to vary only the feature set. Or maybe we have two fundamentally different UX design approaches we could take; that’s a great point to mock up both, test both, and see what resonates. An extreme version of that I did recently for an e-commerce client with all these ideas: we probably had like 12 different variants of this shopping pane that you could have in your browser. In variant one, we focused on what a minimal version would look like, one that only asks you for a few things, and each subsequent variant layered on the next idea.
We also had ideas about when you would transition to the website, and we had different ideas there, so that was pretty cool. And then what you can do, kind of on the fly after a few tests, is go, ‘Okay, this isn’t working, we’re hearing this, let’s riff and iterate.’ I’ve done that rapid iteration, where after a few tests you come up with a slightly modified concept that you then test with people again. So yeah, I do think mocking up different UX approaches can be helpful, as long as you’re not just trying to throw as much spaghetti against the wall as possible. Be disciplined: here’s why we’re testing these five or six concepts, and that makes sense. I’ll also give you an example of one that failed. The one I just described started out not doing well, but we pivoted to success. One where we just didn’t pivot to success was a startup from a while ago that didn’t go anywhere. I won’t say who it is, but I’m comfortable talking about it. It was basically a social finance idea: lenders lend to you based on the data they have, so why don’t we try to tap everyone’s social networking data, LinkedIn, Facebook and so on, and somehow use that too? And it was a good mission: for people who couldn’t otherwise get credit, could we somehow tap into the data available online about them to get them credit? Sounds like a good, innovative idea. So we brainstormed all the features, we did some mock-ups, and then I tested with people, and they were just really, really uncomfortable giving us their credentials. And there was enough of a question mark about, even if we get the data, how much can machine learning actually do with it?
And we found out, to our surprise, that while there are obviously different privacy segments, even the people we tried to screen for not being too concerned about privacy were still concerned. So we realised, hey, the fundamental presumption here, going back to risky assumptions, was ‘will people even let us get this data for them?’, and it turned out that assumption didn’t hold. So yeah.

Randy Silver
Dan, we could talk to you about this for hours and hours, but I have one more question I wanted to make sure we hit. Most of these examples we’ve been talking about are at a fairly early stage in product development. Is that the only time you should be using an MVP? Is it only at that inception and discovery stage, or is it a tool that you use in a continuous way?

Dan Olsen
Yeah, I would use them in a continuous way. It’s easiest to envision how you’d use it for a v1, which is what people keep going back to, but big companies with existing products are constantly adding new features and doing big new projects. Maybe they’re launching a new product, maybe just a feature. Either way, it goes back to: what’s the risk and reward here? As I like to say, if it’s ‘hey, we have this bug and we know what’s needed to fix it’, do we need to go do an MVP and prototype and test it? No, right? So it’s related to the level of uncertainty you have in your assumptions, and what the risk of being wrong is. And if your approach is ‘we have to build it and then test it’, the cost to test is pretty high, so the bar of uncertainty and importance a feature has to clear is higher. But if you build this rapid prototyping muscle in your organisation, the threshold becomes much lower, and it becomes, ‘Before we launch any new functionality, why wouldn’t we create an interactive prototype and show it to 10 users?’ I guarantee you, when you do that, you’re going to learn and find stuff that’s going to make that feature better. You can prevent launching things that aren’t going to make any difference, that people aren’t going to use, like that social finance idea we talked about, and users can give you improvement ideas. And, like I said, you have to do the UX design anyway; you’ve got to tell the front-end developers what to build. So it’s not really that much incremental effort to say, ‘Hey, let’s go run this by some users before we put it into user stories and give it to dev to build.’ Let’s just go test.
So I think you can use it continuously. Obviously, if the scope is very small and the uncertainty is very low, you don’t need to do it. But otherwise, I think it’s a valuable skill to build in your org, to be able to do that rapid testing. And honestly, one of the biggest challenges is: ‘Great, Dan, we’re all for being customer-centric, we’re all for being lean, we’re all for doing this. Okay, we’ve got a prototype ready. Okay, Randy, go line up 10 users.’ And it’s like, ‘Well, dude, it’s going to take me like two weeks to find some users.’ And by then the train’s left the station. It sounds so mundane, but literally, finding and scheduling the users is one of the top obstacles to people doing this more. And unfortunately, for PMs, our job description is never-ending, it’s always stretching out; if no one else is there to schedule these interviews, it’s going to fall on us, and we might be too busy. That’s why the best solutions I’ve seen are roles like research associates or research operations. It’s not a tough job, you just have to find the people and schedule them, so you can have a part-time person or a junior person do that. And I think the best way out of the trap is, instead of waiting till you have the prototype ready, just have that person schedule, say, six users in the first week of every month or something. You don’t know at the time you schedule them what you’re going to ask them about, but you’ll know you’ll have them ready. Otherwise, I see people who are all for it, they go to try to schedule users at the moment they need them, and by then it’s too late.

Lily Smith
Dan, thank you so much. I was going to try and sneak in one more question, which was your top tip for people taking on MVPs, but I feel like you’ve covered it with that tip of making sure you get your users scheduled as soon as possible. So, unless you’ve got anything else you want to add…

Dan Olsen
Happy to add one more thing. I’ll tell you the top two mistakes people make. The first is that they over-scope their MVPs, and it’s ironic, because the whole point of an MVP is to not over-scope whatever milestone you’re planning. But what happens is, ‘Gosh, it’s not good enough.’ We didn’t get into it, but some key stakeholder might be counting on it for the fiscal year: ‘I can’t believe we’re going to launch this product without feature X. I think IBM or Google or insert-big-company-name-here is going to freak out.’ You get a lot of that going on, and the poor product team just says okay and adds the feature, and the next day another stakeholder says the same thing. The next thing you know, you’ve got a bloated MVP. It’s very tough to have the discipline, and I would just say that’s where wireframes can help you out. If you’re having a tough debate on your team, instead of just biting the bullet, putting the feature in, and delaying your launch, you can say, ‘Hey, let’s take a timeout, let’s do a wireframe and see if people complain about this feature missing.’ So that’s the first mistake. The second biggest mistake we kind of touched on with the DURF Pyramid, with delight, usability, reliability, and functionality. There’s a very famous picture online that shows how to slice that pyramid. How not to slice it is horizontally, just taking a subset of the functionality. What does that mean? It means, ‘Hey, we’re launching a subset of the functionality’, which follows the advice I just gave, which is great, but it means you’re ignoring reliability, usability, and delight. And the way that goes down is, ‘Yeah, I know, gosh, I know this has got some bugs, but man, we’ve got to ship this thing on March 31. It’s just an MVP. We’ll fix the bugs later.’
Right. By the way, it’s amazing how many software packages ship on March 31 and October 31, these magical dates at the end of the quarter, right? ‘I know we should have done some user testing, we should have done more UX design, but gosh, we’ve got this deadline. It’s just an MVP.’ So ‘it’s just an MVP’ becomes an excuse to not worry about reliability, usability, or delight. And then when you put those MVPs in front of users, how do you think they do? Back to the previous point: it doesn’t matter if you have great functionality; if the UX is bad or the reliability is bad, it’s not going to test well. And that’s usually when people go, ‘See, this whole lean MVP thing doesn’t work. Let’s go back to waterfall.’ Instead, yes, you want to build a slice, but the slice should be a thin vertical slice that picks up some of those top layers. So whatever subset of functionality you do bite off, it’s not going to be perfect, but it’s going to be reliable enough, usable enough, and delightful enough to truly test the MVP concept. So there are two more tips for people on the way out the door.

Lily Smith
Thank you so much, that was fantastic. It’s been great talking to you today, Dan.

Randy Silver
Hey, you know, this whole thing reminds me, Lily: the original name for this podcast was going to be Minimum Viable Podcast.

Lily Smith
No, it was terrible, and I was never going to let that happen. And the testing we did around it validated the decision.

Randy Silver
Yeah, fair enough. It totally did. And you know, the nice part is we didn’t even have to build any mock ups to learn that lesson.

Lily Smith
So true. So if you like this podcast that’s not called Minimum Viable Podcast, then like and subscribe and leave us a review. We haven’t had one for ages.

Hosted by me, Lily Smith, and me, Randy Silver. Emily Tate is our producer, and Luke Smith is our editor.

Randy Silver
Our theme music is from Hamburg-based band Pau. That’s P-A-U. Thanks to Arne Kittler, who runs ProductTank and MTP Engage in Hamburg and plays bass in the band, for letting us use their music. Connect with your local product community via ProductTank, our regular free meetups in over 200 cities worldwide.

Lily Smith
If there’s not one near you, you can consider starting one yourself. To find out more, go to mindtheproduct.com/producttank.

Randy Silver
ProductTank is a global community of meetups driven by and for product people. We offer expert talks, group discussion, and a safe environment for product people to come together and share learnings and tips.

 

Lily Smith This week on the product experience podcast, we've invited back our good friend, Dan Olsen. Randy Silver Oh, cool. I love chatting to Dan. He's the author of the lean product playbook and a product trainer. And he's just done a tonne of stuff in product over the years. He, he really knows his stuff. Lily Smith He really does. And last time, we covered the kind of tricky concept of product market fit. So this time, we just had to get the king of lean startups back in to cover the notorious MVP. Randy Silver Yeah, I see what you did there. Really, the MVP sure is notorious. But if you want all your MVP myths actually debunked, then just listen on. Lily Smith The product experience is brought to you by mind the product. Randy Silver Every week, we talk to the best product people from around the globe about how we can improve our practice, and build Lily Smith products that people love. Because it mind the product calm to catch up on past episodes, and to discover an extensive library of great content and videos, Randy Silver browse for free, or become a mind the product member to unlock premium articles, unseen videos, ama's roundtables, discounts to our conferences around the world training opportunities. Lily Smith Mind the product also offers free product tank meetups in more than 200 cities. And there's probably one for you. Hi, Dan, welcome back to the product experience podcast. So lovely to have you back in the studio. Dan Olsen Yeah, totally. Man. It's great to be back. I don't remember exactly how long ago it was. But it was fun back then. So I'm really excited to be back talking with you all again today. Lily Smith It's definitely been a while and probably too long. But thank you so much for coming back to join us today at but for anyone who hasn't yet heard of you, which I can't believe there will be anyone listening to this who hasn't come across some of your content? 
And do you want to give us a real quick intro into who you are and your kind of background in product? Dan Olsen Sure, sure. I'm happy to My name is Dan Olsen. And I've been working in product management most of my career. These days, I for quite a while now. I've been a product management trainer. I'm largely teaching private training workshops to companies. I also do public workshops, Speaker conferences, like mine, the product and other product conferences. And I've also done consulting in the past product management consulting. And besides, I started my career into it as a product manager, which was a great place to learn product management back in the day, and then work at startups before getting into doing my own thing as a trainer and consultant. And then two other things I was mentioned, I'm the author of the lean product playbook came out six years ago now a little over six years ago, which is kind of a guide to achieving product market fit. And it's actually funny a funny fact, is the my editor reached out to me because his original thesis was to write a book on MVPs. And so I was like, Well, I'm VPS okay, but why don't we zoom out a little bit and just talk about you know, how to build successful products and we'll cover them up. So in the in the, you know, when they do a book, there's always like a subtitle a second title, you know, so in the in the subtitle, we retain that MVP. It's like how to innovate with minimum viable product and rapid customer. But of course, nobody refers to the subtitle. So anyway, um, yeah, certainly, they look. And then also, I, for a little over seven years now have built a product community called the lean product meetup here in Silicon Valley, where each month they bring in a top speaker, and a lot of the MTP conference, speakers have spoken at my meetup. And we were over 10,000 people now and with with COVID, we took it online. So it's actually been awesome. We now have a global audience. 
So so I'm a big believer in sharing best practices and building community. So that's, that's the kind of things that keep me busy these days. Awesome. Thanks, Dan. Lily Smith And your point about MVP. So our topic today is all about MVP. So before we get stuck into the nitty gritty, let's talk about sort of why the MVP is such a like hotly debated topic, and people get so obsessed, or whether it's like a good thing, and whether it's this or that or the other. And people seem to be very passionately on one side or another side or have very strong opinions about it. So what do you think it is about the MVP that has stirred up so much emotion in the product community? Dan Olsen Yeah, I mean, we have a lot of, you know, acronyms and concepts in the product management world, but I can't think of one that is more divisive than MVP, like, you know, within the companies across companies, between PMS. You know, people just have different understandings and conceptions of it. They're often very, very strong opinions of it. And for me, part of the evidence, you know, is you see different acronyms. You've got MVP, minimum viable product, then people didn't like the fact that that was, you know, that sounds kind of like you're cutting corners. It's going to be a little shoddy. Let's do the minimum lovable product right. So you know, there's like people Making a minimum sellable product, you're about minimal sellable product, minimum viable feature. Right. So anytime you see people kind of reinterpreting and creating their own acronyms, you know, there's a difference of opinion and something's going on there. And, and I remember, back in the day, you know, NDP got popular with the Lean Startup movement, Eric Reese's book, right became much more popular. 
I remember around shortly thereafter, someone wrote a blog post, and it was called, here's lessons learned from here's your lessons that we learned, you know, using the landing page MVP, to kind of test out our test our hypotheses, you know, it was a perfectly fine article. And somebody does really got upset that this person could possibly think that a landing page could possibly be an MVP. And they wrote a counter post called, The title was just a landing page is not an MVP, like that was the title. They were so upset. And so I have screenshots of those. When I teach my work, I'd like put, and I'll just do a poll with my audience. I'd be like, okay, yes or no, let's settle this once. And for all, is it a landing page MVP, or not, you know, and usually get about, you know, 50% plus or minus 10%? voting, picking aside if you will, right. And it's just so funny, because and the way that goes down is, you know, the hardcore people are like, of course, it's not an MVP, what are you talking about? Don't you know what the P stand for? It stands for product, product, a landing page in our product, you can't use a landing page, it just sells you something. So no, no way, MVP product, landing page letter product, no, doesn't compute. And then the more laid back people like, well, man, hey, anything you learn from is cool. It's an MVP, why can't why you got to be so judgy, about a man anything. So it's a very, there's a very strict interpretation. And there's a very loose interpretation. And I've tried, I don't think you can ever get those two parties to agree, there's never going to agree. So what I propose in my book, and what I do is like, let's instead of arguing, you know, black versus white, let's elevate go up one level, let's call them all MVP tests, or MVP experiments, right? Because that gives the hardcore people, the options like that you can reserve MVP for the hardcore stuff that you only consider a truly product stuff. 
But can we at least agree that a landing page could be an MVP tester experiment? And they'll begrudgingly say, Yeah, I guess so. You know, so that's the way out of it. Otherwise, there's no real way out of it. And so, you know, it's just, it's that that's the fundamental difference is does it have to be live product is the hardcore view, is the most extensive review, the softer version of that somewhat soft version is? Is it a representation of a real product, which a landing page still is not, you know, like, like a prototype or something. And then the most liberal, easy interpretations, anything that you're testing a hypothesis, you know, of? That. So that's kind of the spectrum of interpretations that we see, Randy Silver I think you've just given us get another acronym, and its MVP for the minimum viable. The acronym factory is cranking them out. I love it. Lily Smith So how would you let's just to kind of take it down a notch into into some of the sort of the detail, I guess, for those who haven't heard of the term? How would you sort of describe it to anyone new to the concept of a minimum viable product? Dan Olsen Yeah, I mean, the general idea, you know, sometimes people will say, What does MVP mean to you, and they'll just say, a minimum viable product, my guests go, but what does it mean to you is like, the smallest, you know, amount of functionality that creates enough value for a customer, that's, that's kind of the generally agreed upon idea is, is that's the idea. Like, you don't want to build anything more than you have to, you could err on the side of building more than you have to that would be exceed the minimum viable product. You could also add on the side of being too too skimpy, you know, that's the Bible. And you know, the customer gets to say, is viable, what's viable, right? 
So you say, Hey, we think if we just launched this product with these five features, that's going to be viable, and you launch it and they go, now you're missing this key feature that I need, then that would not be viable. So that's the general idea. And I think some important concepts, you know, is that it's reduced scope. You know, it's kind of like the anti waterfall mentality, instead of being like, well, we're gonna launch the Cadillac version, or the Mercedes Benz version. It's like, well, what would the Honda Civic version look like, right? And one of the popular frameworks is floating around out there. I think it's from Spotify, where it shows like, how not to build an MVP, and it should building like a car, and it's like, first to build a tire and then it's like, the frame with four tires. And then it's like, you know, the body. So the point of that framework is, hey, it's not just about building things incrementally, because we get plenty companies doing supposedly doing agile, just doing waterfall in two week chunks, right? It's that you want to create some intermediate milestone and sort of the big deliverable like waterfall, but that intermediate milestone has to actually create customer value, right? So that's part of it too. So it should be reduced, reduced in scope. That's like everyone kind of agrees on that aspect, the MVP. I really like MVPs that are also what I would say is a lower fidelity representation of what you plan to build. So not only are you saying okay, instead of doing the whole thing A lot of scope of all 10 features, we're gonna see if we can get by with an MVP of these four features. Then when you do the four features, you don't say, well, step one is build the four features and see what people think, for me about step one is let's build a lower fidelity prototype of what an MVP with those features would look like and how it would work. And let's test that. 
So there's a kind of two distinct dimensions of an MDP being reduced, in some sense from the full product, you're reducing the scope. And ideally, you're also reducing the fidelity with a prototype prototypical representations. Now, even if you're not doing the second, there's still value in MVP thinking of reducing your scope. Right. And and one of the things I see when people err on the side of thinking of the building, is people thinking, Oh, yeah, I get how for it, we're building a blue sky, Greenfield project, product v1, totally get that we got to figure out the MVP. But that's like the only some people that's the only scenario that they really resonates with for them. And the reality is, it's a mindset, you know, whether you're building version two or 1.1, you still have to answer the question, what's just enough functionality to create enough customer value? And what would be too much? Because the general idea is sure, we can build more, but then we're going to delay the launch, we're going to delay the revenue recognition, we're going to delay the learning and all that jazz, right? So honestly, it applies every sprint, every sprint, you know, good product teams like what really needs to be in this brand. And what can wait. Right? That's really the fundamental essence of that question is what really, really needs to be here. And the challenges most people and the side of Yeah, let's throw in the extra feature. And it's subconsciously, it's like, the more features we have, the higher the odds that we something the customer likes, and they're the higher the odds of success are there. And it's kind of like a cop out. It's a pun, instead of being like, No, we really know our customers, we really know their needs, we really know that we really need these features to meet those needs. 
And these other feature ideas that sound interesting aren't critical. So it ends up being about saying no, right? Which is also my favourite definition of strategy: saying no. And if no seems a little too harsh, it just means not now, not for the product release milestone we're talking about, but sometime after that.

Randy Silver
So it's easy enough to look at this and say, right, I totally get this for the back end. There's that whole thing of build something that doesn't scale, prove that it works, prove that you need scale, and then scale it. So you can skimp on back ends, you can put in manual processes that you can automate later, things like that. But for the front end, for the feature itself, how do you know what's just enough? How do you know that if you do it too low-fidelity, people are just going to respond by saying, I don't like it, I don't need it, because it's not nice enough, not good enough, even if it fundamentally might solve the problem from a functional perspective?

Dan Olsen
Yeah, well, "good enough": that's the feeling people have, and "good" can be interpreted in different ways. One way people worry it might not be good enough is: gosh, it doesn't have enough features, it doesn't have enough functionality. That's what I was just talking about, and the tendency is to just put additional features in there. Even with what you were saying about doing something that doesn't scale on the back end, or Wizard of Oz, or something, I still believe the first thing you should do is a prototypical representation of it. Don't even build anything; building is the most expensive thing you can do. Even if it's, hey, we're building it a cheap, non-scalable way, I would still always use a lower-fidelity representation first.
Now, theoretically, there's a risk that your low-fidelity representation is so crude, like showing a napkin sketch to, say, the CMO of a Fortune 100 company, that they're gonna be like, what are you showing me here? But you'd be surprised: you don't really need that high a fidelity, typically. I'll talk about it now: there are two main prototype fidelity points. There's low fidelity, where you'd use a tool like Balsamiq or something, and that would probably be grayscale. You wouldn't worry about the fonts and the colours and the pixel perfection, but it would give a sense of what features are in this product, the general layout, the general navigation, the general information architecture, and some basic interaction design. That's the low-fidelity point. And then the higher-fidelity point: you can use a tool like InVision, which is awesome because it can take the image output of any of the tools designers use, like Sketch, Illustrator, Photoshop. Then you, as a non-designer, non-technical person, can take those exported images, upload them and create little hotspots, and say, okay, when somebody clicks on this rectangle that I'm overlaying over this button, it should go to the next screen. I'll tell you, that InVision level is where I do most of my product validation when I help clients; you can get so much done if you do it really well there. And the easiest rule is always do lo-fi first and then do hi-fi testing. What you get with lo-fi is that you can settle debates about missing functionality and major issues with your UX, basically. So the biggest thing is, say a team is arguing about, do we really need this feature in the MVP, and the team is split on it, right?
A great way to test that would be: let's build a low-fidelity prototype of the MVP without that feature in it, let's show it to 10 customers, and we'll see which half of the team is right. If all the customers complain, I'll be the first one to say, hey, you're right, we need to put feature X in. But maybe, just maybe, it's just you worrying your MVP isn't good enough, and it's not based on data or evidence. And when we go and talk to customers, they don't say anything, because customers don't usually complain if you have extra stuff in your prototype. But if you're missing something critical, they will definitely tell you, and they'll complain.

Lily Smith
So at that point, what would you say is the ultimate purpose of the MVP? When is the best time to use it?

Dan Olsen
The ultimate purpose of it is basically to test your hypotheses. That's the whole goal, right? The main framework from my book is the Product-Market Fit Pyramid, which categorises the five hypotheses: who's our target customer? What are their underserved needs? What is our value proposition, which is to say, which needs are we going to target, and how are we going to meet those needs in a way that's better than the competition? What should our feature set be? And what should our user experience design be? The tip of that pyramid is the UX. You're working your way up to that UX, and when you have a prototype, it encapsulates and embodies all the assumptions and hypotheses you made along the way. So you're really testing everything. And I would argue the ideal time is before you do any building or coding. If option A is you test a prototype before you code and build, you're gonna have time to change it. If option B is, well, we don't have time to do that, let's just code the alpha and see what people think, that's gonna take longer, and now you're also anchored in solution space.
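Dan's "show it to 10 customers" heuristic has a statistical footing worth keeping in mind: a small sample can rule out widespread problems, but not rare ones. A minimal sketch in Python (the use of a Wilson score interval here, and the n=10 numbers, are illustrative assumptions, not something Dan specifies):

```python
import math

def wilson_interval(hits: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a proportion observed in n user tests."""
    p = hits / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return max(0.0, centre - half), min(1.0, centre + half)

# If 0 of 10 testers complain about the missing feature, the true
# complaint rate could still plausibly be anything up to roughly 28%.
# Ten tests catch must-have gaps, not edge cases.
low, high = wilson_interval(0, 10)
```

In other words, a clean run of 10 sessions is strong evidence against a feature being a widely felt must-have, which is exactly the debate Dan suggests it should settle.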
It's like: we've already built with this library, the API calls are designed to do it this way. I know the user feedback was that they want to do it differently, but sorry, we already picked our JavaScript library and we already set up the database. You're starting to build in constraints. The second you start building, the tech debt starts accruing and you're building in constraints. It's so much better in prototypes; in the example I'll talk about later, we moved so quickly because it was all done in prototypes. So, back to your question: it's there to validate and test some major assumption that you have, like, do we need feature X or not? For an MVP, you would do a prototype and test it: we've identified this value proposition, and we think that if we cobble together this set of functionality, customers are going to think it's superior to the existing products out there. That's when you use an MVP, to see if that all rings true when you actually test it with people.

Lily Smith
So, you must have worked with lots of different businesses on MVPs. I'm really curious to know, what are your favourite examples of MVPs that have proven something to be right, but also ones that have proven something to be wrong?

Dan Olsen
Yeah. And I'll just say at the outset, those are extremes: 100% right, 100% wrong. You rarely get those kinds of outcomes. It's all about iterating: can we get more right, or is this just a bad idea? I can give some specific examples of mine, but I think it'd be good first to talk at a high level about the different types of MVPs you hear about, going back to that thing about the hardcore product people versus people being looser in their interpretation.
So the hardcore product MVPs are the things I've been discussing: wireframes, mockups, interactive prototypes. They're meant to be a representation of your product. A landing page is a sales pitch for your product; it's not actually meant to be a representation of your product. And that's where two other ones come in. The Wizard of Oz we kind of alluded to a little bit: you're doing something that isn't the real technical solution at the end of the day, but it's behind the curtain, like the Wizard of Oz, so the customer isn't aware of what's going on. The famous example there is Zappos. When Tony went to sell shoes, he didn't go buy 1,000 shoes, stock a warehouse and then say, okay, now I'm ready to sell them. His biggest risk was: why are people gonna buy shoes from a website without being able to try them on, is this even gonna fly? So he got the website up with pictures of shoes, you could order, and at the end of the day he'd bundle up the orders, go down to the local shoe store, buy the shoes and mail them. That's called Wizard of Oz: your customers have no idea what's going on. Related to that is the concierge MVP. The only difference there is you're still doing some manual process that's not going to scale, but you're not hiding it from your customers. It's like, hey, we're doing this as a hands-on service, you're a small group of people, and we're gonna work with you very closely on this manual process to get it down before we automate it, things like that. And then, last but not least, your live product: if you launch an alpha or a beta, everyone agrees that's an MVP. But hopefully I've made a good case for why, before you go to the live-product MVP, you should at least use some of these other ones. The one I love the most is the interactive prototype, as I said.
So that's the hardcore product stuff. And I like dividing those into qualitative and quantitative; those are all pretty much qualitative, because you're just going to be showing them to people. On the quantitative side for product stuff, there's the fake door, or the 404 page: before you go build the feature, let's see if anyone's even interested in it. I would call this more demand validation. The popular example here is Zynga. When they were thinking about an idea for a new game, before they'd go too far down that path, they'd put an ad for that game in their existing games. And when you click on it, instead of the game being there, the elegant version is: hey, thanks for expressing interest, we'll put you on the list when it comes out. It's called a 404 test because the quick-and-dirty way to do it is that the click just goes to a 404 page and you track the clicks. The idea is you're looking at what percentage of people actually express interest by clicking on the thing. And then obviously product analytics and A/B testing are a good way to test new ideas too: if you're considering a new UX design or new feature, you can use an A/B test and see the true impact on usage. That's the hardcore product stuff. And then I love all the other non-product stuff, which I'd put under a marketing umbrella, because you're really just testing how you talk about the product, does it resonate with people, which is valuable, don't get me wrong. Once you've gotten clear on your value prop and how you want to message it, by all means test it. On the qualitative side, that's basically any marketing materials you show to someone to see what they think. And then on the quantitative side, that's where you can do a landing page test.
You put your best foot forward in a landing page about what you're planning to build, well before you build it, and you see what kind of conversion you get: you make an ask for an email, or you make an ask for a credit card, and you see what percentage of people do it. One of the other famous examples is Dropbox. Drew made what's called an explainer video, and the reason he did that is because it was too complicated to explain just with text and photos on a landing page; it has to do with syncing across your desktop and the web. So that's an explainer video: you explain what you're doing in a video because it's more complicated. And then another one is crowdfunding sites like Kickstarter. If you think about it, they flip the risk on its head: instead of building the product and then trying to sell it, you try to sell it first, and if you don't get backed, you're not even going to bother building it. Especially for hardware, that's a good one. Anyway, that's the landscape, I would say, of the different MVPs.

Lily Smith
Are you struggling to find the answers to your product questions, keen to learn from others in the community, and want to know where to go next in your career? Mind the Product can help.

Randy Silver
Mind the Product membership will help you to level up your career, build better products and lead successful product teams. And as the world's largest professional network for product people, with decades of product management experience, you won't find this anywhere else.

Lily Smith
As a member, you'll get exclusive access to premium editorial, product experts and product peers tackling similar challenges, plus brand new self-paced online training modules that cover core product skills, like goals, alignment, prioritisation, hypotheses and testing, and more.

Randy Silver
For more info and to become a member today, visit mindtheproduct.com/join.
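The fake door and landing page tests Dan describes both come down to a conversion rate, and before acting on a lift between two variants it's worth a quick significance check. A minimal sketch in plain Python (the two-proportion z-test and the visitor numbers are illustrative assumptions, not figures from the episode):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the null hypothesis that both conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Landing page A: 30 email signups from 1000 visitors (3.0%)
# Landing page B: 55 email signups from 1000 visitors (5.5%)
z = two_proportion_z(30, 1000, 55, 1000)
# |z| above 1.96 means the lift is significant at the 5% level,
# i.e. unlikely to be sampling noise.
```

The same check applies to a fake-door test: the "conversions" are just clicks on the ad for the not-yet-built feature.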
Yeah, that's a really good one, crowdfunding. I want to take it back to the Wizard of Oz and concierge, because there's a mistake that I think people make, and I'm curious to get your take on it. Which is: you build the InVision prototype and you get people trying it, but it's a wireframe, it's a walkthrough. There's no actual functionality there. They're not getting value, they're not having a problem actually solved. So what are they testing? Is it usability, is it propensity to pay, or are they actually validating that they would use this?

Dan Olsen
Yeah, and that's an interesting thing that I get into in my workshops, and I learned this the hard way. When you test a prototype, you're going to get a mix of usability and UX feedback, and what I would call true product-market fit feedback. Now, here's the thing: if you have a horrible UX, it's going to get in the way of getting true product-market fit feedback. If people can't even click around your prototype, they're not going to go, this is awesome. As I like to say, it doesn't matter if you have the world's best functionality. We have the world's best search engine, but guess what, nobody can figure out how to use it: you don't get credit, you don't get customers. And there's a great framework that I like to use. Since this is a podcast, I can't show you the image, but it comes from Aaron Walter, who was the head of UX design at MailChimp. I remember when I first used MailChimp, it was head and shoulders better UX than all the other email campaign programmes out there. It was just so easy to use, and it had a funny, playful tone. You'd log in and it's like, hey, you're looking good today, Dan, all kinds of funny stuff. So anyway, he wrote a book called Designing for Emotion, and I'm not surprised he wrote it, given what I saw with MailChimp.
And then another designer, Jesse, adapted it into a framework, and then I adapted that. I just want to give those guys props, because the original idea wasn't mine; I just adapted it. The idea is you've got a pyramid, a hierarchy of four things. The bottom of the pyramid is functionality: how functional is the product, what does it actually do for you? The next layer up is reliability: does it work as expected, is it buggy, what's the quality level? The next level up is usability: can I actually use it, how easy is it to use? And then the final tip of the pyramid is delight. So it goes delight, usability, reliability, functionality, top to bottom. Now, one of my clients got tired of writing those out, so they put D-U-R-F, and we've named it the DURF pyramid. The whole reason for that pyramid is that about 15-plus years ago, everybody got the memo: hey, usability matters. For a while people were like, yeah, whatever, they have to use it, it's B2B software. But now everybody knows. Not everyone's building great UXes yet, but they can't say they don't know that they should at this point. So the whole point was to elevate the discussion: if usability answers the question, can they use the product, then delight answers the question, how do they feel when they use your product? Do they want to use your product? That was the point, to elevate that. And so, back to your point: if you've got bad usability, it's going to prevent you from getting a true read on the product-market fit of the functionality you've built. And I had this happen firsthand when I had my startup. We were launching at TechCrunch, and it had to be a private beta, because we had to launch at the event.
And so I'd bump into someone at a coffee shop in Palo Alto and they'd be like, hey, what do you do? And I'm like, well, you know, I'm doing a startup. Oh, cool, can I come check it out? And I would always say, yeah, you can check it out, but the price of entry to our private beta is you've got to let us watch you go through the new user flow and sign up. I'd tried to do the best job I could on the features and the UX, but in the first 15 or 20 tests, we ran into some usability issues. People were like, I don't understand what this term means, I need some examples. We found a few bugs. So we improved the new user flow, and we had a small team, so we did it very quickly. After about test 20 or 25, those issues went away; we weren't hearing them anymore. So then I got a little confident: cool, these tests are going pretty well. At the end of each test, I'd ask people, would you use this product? And to my surprise, about 15% of people, even though the session went great, with no major complaints or concerns, were like, no, I'd never use your product. And I was just like, what? What do you mean? You didn't complain about anything during the test. What it was, it turns out, is that different people like to get news in different ways, kind of a philosophical approach to how you like to get your news. And it turns out that, unbeknownst to my co-founder and me, we had designed it for the way we like to get news, which, lucky for us, was about 50-60% of the market based on our rough research. After we learned about this, we found another 20-30% of people liked to get it a different way, but luckily our design worked for them too. But that last 15% of people: there was just no way our design was gonna work for them.
So that illustrated to me, Randy and Lily, that there were no usability issues left. After 20-25 tests, we'd eliminated all the usability issues. They were getting through the flow, getting to the main product, using it and understanding it fully, and then not wanting to use it. And so, for me, that's the difference between usability feedback and product-market fit feedback. So yes, bad UX can get in the way of testing that, and you need to iterate your way past it to get there.

Lily Smith
Thanks for taking us through all of that. But I just want to go back to a question I mentioned earlier: what are some of the good examples of MVPs really shining a light on the direction to go in?

Dan Olsen
Yeah, thanks. I just wanted to provide that context on the different types. So the one I talk about the most in the book and the workshop is a project I worked on called MarketReport.com. And this is great, because the startup CEO, with an existing product and such a small team, said, we can't code everything; we've got to use prototypes. I'm like, awesome, we're going to use prototypes. We did the whole process, and honestly, the idea for the product was the CEO's pet idea: a kind of marketing report. So, why do you get the junk mail, the marketing mail that you get? And it was gonna have a marketing score, like your credit score; he'd done his thing in the credit industry. So he was making the jump: can we do something like what happened in the credit industry, but for marketing? And so we went down the steps, built an InVision prototype, and tested it. And there were tonnes of questions and concerns; people were not excited about it. And we had tested a few other features besides the core thing.
Luckily, we had tested two secondary ideas, and those kind of resonated. I mean, the test did not go well. Again, there was so much confusion and so many comments, people saying, nah, I'm not gonna go for this. The main idea the CEO had just did not resonate at all; that was kind of unsalvageable, if you will. But the secondary ideas, even though there were a lot of questions and concerns, we felt like, hey, we have a good handle on this. If we pivot and focus only on this one benefit, and take all the feedback we have, we think there's an opportunity with this customer base. So that's what we did. With one pivot, one iteration, we completely changed the concept and refocused it with a much tighter scope. The first MVP had maybe five different benefits or features; this one had just one or two, went really deep, and applied all the learning we got from the research. And the second time around, it was night and day. There were still some questions and concerns, but they were much smaller. Both times we asked people, hey, would you pay 10 bucks a month for this service. Obviously, if you ask people whether they'd pay, they don't actually have to pay you, but it was a night-and-day difference. And the funny thing was, the way these sessions work is you say, hey Randy, come spend 45 minutes with us, we'll give you an Amazon gift card, or a check for 100 bucks or something. At the end of the test, people usually take their check and run. But at the end of this test, everyone stayed and asked: oh, so is this live now? Can I go use it?
So it was really exciting to see that with very careful attention to what we learned from customers, and pivoting and iterating, which, again, we did in about four weeks because we were doing it with prototypes, we had no qualms throwing the original prototype out the window and starting with a fresh slate. If that had been live code, I think we'd have had a little more sunk-cost fallacy going on.

Lily Smith
Do you often take different variants of prototypes to customers? Because when you're early stage with an idea or concept, there are often a few different ways you could approach it.

Dan Olsen
Yeah, that's a great question. It'd be easy to say, yeah, test everything. But it takes time, right? So you don't want to kill yourself with too many options. The main reason we did that, which I kind of skipped over, is that the original feature set the collective team had brainstormed had like eight or nine features in it. And when it came time to define the MVP, I'm like, I don't want to put in all eight or nine. That's a kitchen-sink MVP, with everything, which is like trying to throw as much against the wall as possible. So what I said was, let's take the core idea, this marketing score stuff, and combine it with a logical group of two other features. That's one MVP. So it's all about feature set definition; that's the scope for that MVP. And the other variant took the core idea plus the other ideas. Putting everything together would have been way too much in a single MVP; it would have been all over the place. But to your point, you have to decide how far down the pyramid you're gonna go with variants.
So one easy thing to do is, like we just said: you're not varying the feature set, but you have two fundamentally different UX design approaches you could take. We could model the UX design concept this way, or that way. That's a great point to mock up both, test both, and see what resonates. An extreme version of that I did recently for a client, which was awesome, an e-commerce client with all these ideas. We probably had like 12 different variants of this shopping pane that you could have in your browser. In variant one, we focused on what a minimal version would look like, one that doesn't ask you for much, and then each next variant layered on another idea. We also had different ideas about when you'd transition to the website, so we had different variants there too. And then what you can do is, on the fly after a few tests, go: okay, this isn't working, we're hearing this, let's riff and iterate. So I've done that rapid iteration, where after a few tests you come up with a slightly modified concept that you then test with people again. So yeah, I do think mocking up different UXes can be helpful, as long as you're not just throwing spaghetti against the wall, as long as you're disciplined: here's why we're testing these five or six; that makes sense. I'll give you an example of one that failed. The one I just talked about started out not doing well, but we pivoted to success. One where we just didn't pivot to success was a startup. It was a while ago, and the startup didn't go anywhere, and I won't say who it is, so I'm comfortable talking about it.
But it was basically the social finance idea: hey, lenders lend to you based on the data they have, so why don't we try to tap everyone's social networking data, LinkedIn and Facebook, and somehow use that too? And it was a good mission: for people who couldn't otherwise get credit, could we tap into the data available online about them to get them credit? Sounds like a good, innovative idea. So we brainstormed all the features, did some mockups, and then I tested with people, and they were just really, really uncomfortable with giving us their credentials. There was also enough of a question mark about, even if we get the data, how much machine learning can we do, what can we actually do with the data? But we found out, to our surprise: obviously there are different privacy segments, but even the people we tried to screen for not being too concerned about privacy, even they were still concerned. And so we realised, hey, the fundamental presumption here, going back to the riskiest assumptions, was: will people even let us get this data for them? It turned out that didn't work out. So yeah.

Randy Silver
Dan, I could talk to you about this for hours and hours, but I have one more question I wanted to make sure we hit. Most of these examples we've been talking about are at a fairly early stage in product development. Is that the only time you should be using an MVP, only at that inception and discovery stage? Or is it a tool that you use in a continuous way?

Dan Olsen
Yeah, I would use them in a continuous way. It's easiest to envision how you could use it in the v1; that's what people keep going back to. But big companies that have existing products are constantly adding new features.
And they're doing big new projects: maybe they're launching a new product, maybe just a feature. But even when you do a new product, it goes back to: what's the risk and reward here? As I like to say, if it's, hey, we have this bug and we know what's needed to fix it, do we need to go test it with an MVP and a prototype? No, right? So it's related to the level of uncertainty you have in your assumptions, and what the risk of being wrong is. If your approach is, we have to build it and then test it, well, then the cost to test is pretty high, so the bar for how much uncertainty and importance it has to carry is higher. But if you build this rapid prototyping muscle in your organisation, then the threshold becomes much lower, and it's: okay, before we launch any new functionality, why wouldn't we create an interactive prototype and show it to 10 users? I guarantee you, when you do that, you're going to learn and find stuff that will make that feature better. You can prevent launching things that aren't going to make any difference, that people aren't going to use, like that social financing idea we talked about. It can give you improvement ideas. And as I said, you have to do the UX design anyway; you've got to tell the front-end developers what to build. So it's not really that much incremental effort to say, hey, let's run this by some users before we put it into user stories and give it to dev to build. Let's just go test. So I think you can use it continuously. Obviously, if the scope is so small and the uncertainty is very low, then you don't need to do it. But otherwise, I think it's a valuable skill to build in your org, being able to do that rapid testing.
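Dan's "risk and reward" test for whether something deserves a prototype round can be made concrete with a back-of-the-envelope expected-cost check. This is a rough way to frame his point, not a formula from the episode, and the person-day costs below are made-up illustrations:

```python
def worth_prototyping(p_wrong: float, build_days: float, prototype_days: float) -> bool:
    """Prototype-test first when the expected waste of building the wrong
    thing (chance of being wrong times build cost) exceeds the cost of a
    quick prototype-and-test round. All costs in person-days."""
    return p_wrong * build_days > prototype_days

# A well-understood bug fix: near-certain assumptions, small build.
fix_bug = worth_prototyping(p_wrong=0.02, build_days=5, prototype_days=3)
# A speculative new feature: 30% chance it misses, weeks of dev at stake.
new_feature = worth_prototyping(p_wrong=0.30, build_days=40, prototype_days=3)
```

As Dan notes, the cheaper prototyping gets in your org, the lower the threshold drops, so more and more work clears the bar for a quick test first.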
And honestly, one of the biggest challenges is: great, Dan, we're all for being customer-centric, we're all for being lean, we're all for doing this. Okay, we've got a prototype ready. Okay, Randy, go line up 10 users now. And it's like, dude, it's gonna take me two weeks to find users, and by then the train's left the station. It sounds so mundane, but literally, finding and scheduling the users is one of the top obstacles to people doing this more. And that's why the best solutions I've seen are, well, unfortunately, as PMs, our job description is never-ending, always stretching; if no one else is there to schedule these interviews, it's gonna fall on us, and we might be too busy. So you see people like research associates or research operations. It's not a tough job; you just have to find the people and schedule them. So you can have a part-time person or a junior person do it. And I think the best way out of the trap is, instead of waiting until you have the prototype ready, just have that person schedule, say, six users the first week of every month or something. You don't know, at the time you schedule them, what you're going to ask them about, but you'll know you'll have them ready. That's the trap I see: people are all for it, they go to try to schedule users at the moment they need them, and then it's too late.

Lily Smith
Dan, thank you so much. I was gonna try and sneak in one more question, which was, what's your top tip for taking on MVPs, but I feel like you covered it with that tip of making sure you get your users scheduled as soon as possible.
So unless you've got anything else you want to add?

Dan Olsen
One thing I can add on the way out: I'll tell you the top two mistakes people make, right? The first is, and it's ironic, they over-scope their MVPs. It's ironic because the whole point of an MVP is to not over-scope whatever milestone you're planning, right? We didn't get into it, but obviously some key stakeholder comes along and says, I can't believe we're gonna launch this product without feature X. I think IBM, or Google, or insert big company name here, is gonna freak out. You get a lot of that stuff going on, and the poor product team just says okay and adds the feature. The next day another stakeholder says the same thing, and the next thing you know, you've got a bloated MVP. It's very tough to have the discipline. And I would just say that's where wireframes can help you out. If you're having a tough debate on your team, instead of just biting the bullet, putting the feature in, and delaying your launch, say, hey, let's take a timeout, let's do a wireframe and see if people complain about this feature missing. So that's the first mistake. The second biggest mistake we kind of touched on with the pyramid, right? With delight, usability, reliability, and functionality. There's a very famous picture online that shows how to slice the pyramid, and how not to. The wrong way is to slice it horizontally, just taking a subset of the functionality. What does that mean? It means, hey, we're launching a subset of the functionality, which follows the advice I just gave, which is great, but it means you're ignoring reliability, usability, and delight. And the way that goes down is: yeah, I know, gosh, I know this has got some bugs.
But man, we've got to ship this thing on March 31. It's just an MVP, we'll fix the bugs later, right? By the way, it's amazing how many software packages ship on March 31 and October 31, these magical dates at the end of the quarter. I know we should have done some user testing, we should have done more UX design, but gosh, we've got this deadline, it's just an MVP. So "it's just an MVP" becomes an excuse to not worry about reliability, or usability, or delight. And then when you put those in front of users, how do you think they do? Back to the previous point: it doesn't matter if you have great functionality; if the UX is bad, or the reliability is bad, it's not going to test well. And that's usually when people go, ah, see, this whole lean MVP thing doesn't work, let's go back to waterfall. So instead, it's true, you want to build a slice, but the slice should be a thin vertical slice that picks up some of those top layers. So whatever subset of functionality you do bite off, it's not going to be perfect, but it's going to be reliable enough, usable enough, and delightful enough to truly test the MVP concept. So those are two more tips for people on the way out the door.

Lily Smith
Thank you so much, Dan. That was fantastic. It's been great talking to you today.

Randy Silver
Hey, you know, this whole thing reminds me, Lily, the original name for this podcast was going to be Minimum Viable Podcast.

Lily Smith
It was terrible, and I was never going with it. And the testing we did around it validated the decision.

Randy Silver
Yeah, fair enough. It totally did. And you know, the nice part is we didn't even have to build any mock-ups to learn that lesson.

Lily Smith
So true. So if you like this podcast that's not called Minimum Viable Podcast, then like and subscribe and leave us a review, we haven't had one for ages. Hosted by me, Lily Smith, and me, Randy Silver. Emily Tate is our producer.
And Luke Smith is our editor.

Randy Silver
Our theme music is from Hamburg-based band Pau. That's P-A-U. Thanks to Arne Kittler, who runs ProductTank and MTP Engage in Hamburg and plays bass in the band, for letting us use their music. Connect with your local product community via ProductTank, our regular free meetups in over 200 cities worldwide.

Lily Smith
If there's not one near you, you can consider starting one yourself. To find out more, go to mindtheproduct.com forward slash ProductTank.

Randy Silver
ProductTank is a global community of meetups driven by and for product people. We offer expert talks, group discussions, and a safe environment for product people to come together and share learnings and tips.