The last method we looked at was how we go out and learn about who our customer is, what kind of shoes they wear, and what's on their A-list in this area. Now, I get this question all the time, and I observe this all the time: why don't I just go ask customers what they want, and they'll just tell me? Wouldn't that be a customer-centric way of building just the right thing? Unfortunately, you can't do it. It just doesn't work. Generally, what you get is a false positive. They'll say, sure, that sounds great. But they're (a) trying to avoid the awkwardness of telling you that they don't want it, and (b) they really don't know what their future behavior is going to be, particularly if the thing you're proposing is new and not something they've used before. That's the bad news. The good news is that there are really terrific methods and a really nice body of work, with case studies and examples, of how to do this: how to test this demand hypothesis, which, as you saw, we can link very tightly to these personas and jobs-to-be-done. That's the body of work around Lean Startup. Essentially, the question is, how do we use minimum viable products (MVPs, as you may have heard them called) instead of a 1.0? How do we build something that's a proxy for the product experience or the product's proposition, to test this demand hypothesis before we do anything else, in order to maximize wins by minimizing waste? If you've heard of Lean Startup, or read the book The Lean Startup, which is very good, you may have seen this build-measure-learn process. It all sounds good and it works well, but the interesting thing is that you're actually supposed to start with learn, which is one common misconception. The other thing I like to do, since the framing we're using is these hypothesis-driven approaches, is unpack it with the plain old scientific method.
So, in that sense, we start from an idea, and then we flesh it out by forming a strong persona and job-to-be-done hypothesis. Then we form a demand hypothesis; we'll talk about that in a second. Really, the beauty of Lean Startup is in its lessons about how to design experiments and run them in really tiny batches, in a few days or a week, so that you can drive to these moments where you say: nope, this is a false signal, this does not look promising. We might want to re-tune it and retry, but we have evidence here that this isn't the place where we should be investing our time or moving this any further into the product pipeline. Alternatively, if you get persevere evidence, like, yeah, this looks really promising, you can go forward with confidence, more focus, and more understanding of specifically what's working and why. And you can bring that evidence to your team, which is going to help a lot with focus. One of the most crucial things for getting started with Lean Startup, in our particular framing of this continuous design material, is having a demand hypothesis. The way I like to write those is always with this kind of syllogism: if we do this certain thing for this certain specific persona, then they will respond in this certain way. Putting that together as an example: if we build software for HR managers to screen applicants before they go further in the hiring pipeline, then HR managers will try it out, use it, and buy it. It sounds so simple, so obvious, but most good design things look that way in retrospect. So that is the essential body of work around Lean Startup: how do we form these hypotheses and then pair them with experiment vehicles, MVPs? Here are a few of the most popular MVP archetypes. We'll go from the most observational but least decisive evidence to essentially the opposite. Let me show you what I mean by that.
With a concierge MVP, we're hand-creating the customer experience. For example, with the HVAC in a Hurry team, they might go and shadow the technicians and just do the job, be their assistant for ordering parts, just to learn about how that process happens. Or if you're building enterprise software for something as boring as processing a purchase order, you can build a bit of scaffolding where you help people do that, just to learn about the process and how it works. Next, the Wizard of Oz test. These are really popular in robotics and voice interfaces, places where a lot of engineering and data science has to go into really operationalizing something. It's a little different from the concierge: the idea with the Wizard of Oz is that there's a fake UI of some sort, like a robot that talks to somebody, but instead of it really being automated, there's a man or a woman behind the curtain actually operating the thing. So this is a way of providing the target user experience without coding it out and making it actually work, so that you can learn about how users interact with it before you over-invest. And then finally, the smoke test MVP. This is probably the one you've most likely heard about if you're familiar with Lean Startup. This is where we sort of pre-sell our idea before we go build it. For example, we might run a Google AdWords test: we run some ads and see if people click on them as a way of assessing, for this persona looking for this thing, do we have a proposition that at least interests them initially and gets their attention? Dropbox famously used this with a demo of their product to drive pre-sign-ups, and brought that evidence to go raise money and execute. Kickstarter and Indiegogo campaigns, for example, are really, in a way, their own little smoke tests.
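As a rough sketch of how you might score the evidence from a smoke test like that AdWords example, here's a bit of hypothetical Python. The 1% click-through threshold and the numbers are illustrative assumptions, not figures from the lecture; the important thing is deciding the threshold before you run the test.

```python
# Hypothetical smoke-test tally: did our ad proposition get enough
# attention to count as "persevere" evidence? The function names and
# the 1% threshold are illustrative assumptions, not fixed rules.

def click_through_rate(impressions, clicks):
    """Fraction of people who saw the ad and clicked on it."""
    return clicks / impressions

def smoke_test_verdict(impressions, clicks, threshold=0.01):
    """Return 'persevere' if CTR clears the pre-set threshold, else 'pivot'."""
    ctr = click_through_rate(impressions, clicks)
    return "persevere" if ctr >= threshold else "pivot"

# Example: 5,000 impressions, 85 clicks -> CTR = 1.7%, above threshold
print(smoke_test_verdict(5000, 85))   # persevere
# Example: 5,000 impressions, 20 clicks -> CTR = 0.4%, below threshold
print(smoke_test_verdict(5000, 20))   # pivot
```

Either outcome is useful: a "pivot" verdict here is exactly the cheap, early elimination of a non-winner that the lecture is about.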
You have this thing that you want to build for a community of interest, beekeepers, let's say. You put it out there, and you see whether people want to preorder it and chip into the campaign or not. So I would say that's an example of the smoke test MVP. To me, the most exciting thing about Lean Startup is this idea of eliminating the ideas that aren't meant to be earlier in the process, like this. If we can, let's say, double our value for small f, the amount of feature and release content we're building that really matters to the user, then it's like we're doubling the amount of capacity we have in the product pipeline for the same value of big F, the total amount we build, which is pretty exciting. Now, you might ask, well, if all this stuff is so great, why isn't everybody out doing this? Isn't that kind of evidence that this really doesn't work that well? And I would say that's a good, reasonable question. There are a few reasons. One is the fallacy of, well, I have all these developers on hand, and if I don't use them, I'm going to lose them, or their time is going to waste. So you start building something, everybody gets over-invested in it, and you end up down this path of building something that nobody wants. That's one way it happens, a lot of the time. Another thing is prototyping, in the sense of building something that looks like a finished product, which may or may not be a good vehicle for your MVP. It could be if you're doing a Wizard of Oz, but it's really not if you're doing a concierge or a smoke test. That's another thing that draws people away from really testing: it makes them feel like they're doing an MVP, but really they're just starting to work on their 1.0, and they haven't taken the opportunity to eliminate the ideas that aren't going to be winners. And then finally, it's kind of weird running these experiments.
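That capacity point, doubling the share of shipped work that really matters to users, can be sketched as simple arithmetic. A hypothetical illustration in Python; the numbers are made up:

```python
# Illustrative arithmetic (made-up numbers): if the pipeline ships a fixed
# number of features per period (big F), and only a fraction of them really
# matter to users (yielding small f), then doubling that fraction doubles
# the valuable output without adding any engineering capacity.

def valuable_output(big_f, hit_rate):
    """Features shipped that users actually value: small f = F * hit rate."""
    return big_f * hit_rate

before = valuable_output(20, 0.2)  # 20 features shipped, 20% matter -> 4.0
after = valuable_output(20, 0.4)   # same capacity, better validation -> 8.0

print(before, after, after / before)  # 4.0 8.0 2.0
```

The lever here is the hit rate, which is what cheap demand-hypothesis experiments improve, not the raw build capacity.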
The concierge and the Wizard of Oz in particular don't come in nice, tidy packages. You're going to go out, you're going to do them, and they're going to be kind of messy. And the evidence you bring back just doesn't show as well as a fancy prototype or something. Evidence as simple as "we shouldn't build that" is actually an amazing thing to bring to your team, but it doesn't look good in a weekly review or a management meeting unless people are familiar with these methods. So, those aren't reasons not to do this; those are things you have to surmount to make this work for your team. We will look at a lot more specific examples of how to do this if you're joining me for the specialization. But if you're really interested in trying one of these out, here again, there are tutorials, templates, examples, and resources in the course resources if you want to jump in and get started. So those are some ideas about how to test your demand hypothesis and maximize wins by minimizing waste.