We've talked about the importance of good user stories, both as a quality anchor for your work with an agile team and as a signal that you don't yet know enough to go off and build something, and that what you need to do is spend a little time doing discovery about your users. We talked specifically about how every user story should thread back to a persona and a job to be done. Each one should have a testable reward that we know is important to the customer because of our proposition and because of our proposition's relationship to one of their jobs to be done. So, no problem, right? You just go off and do that. Well, most of us find that whether we're a single-person startup or we work at a 100,000-person company, it's always hard to find time to do research. You will hear things like this from your stakeholders: "That'd be great, but we just don't have the time right now." "I think we already got some clip art and made some personas." "We've just got to get this done; we don't want to make a science project out of it." There are perfectly legitimate sentiments behind those things. They've probably had bad experiences with research that wasn't economically relevant, that wasn't done well, that was maybe big research up front, and they're a little gun-shy, is what you'll tend to find. And yet the statistics are not in our favor, if we believe that a lot of waste happens because we don't know quite enough about the user. So, assuming we subscribe to the idea that we should do just enough work to make sure something is important to the user before we spend geometrically more time and money building it in software, let's talk a little more about these research methods within continuous design.
There are three principal methods. What we've focused on so far in this particular course is the question of how we go out and do subject interviews: we screen subjects, and then we interview them. We ask them, for instance, "What are the top three hardest things about completing these HVAC repairs?" We don't say anything else; we wouldn't necessarily even tell them we're working on a piece of software. We just see what we hear. That's the instrument you're learning about, and you don't make a big deal about it. You can take an afternoon, you can take a day; it doesn't have to be a week-long sprint. You should treat the output as a hypothesis, though. Making a persona is not like laying a cement foundation; it's an asset that answers the questions you had at the time. It is perfectly normal and perfectly predictable that you'll run into a situation where that research and those personas you created don't answer the questions you have at the moment, and that's fine. It's just a signal that you should go freshen them up and get the answers you now know you need. Quickly recapping: with the demand hypothesis we're asking, do we have evidence that the persona wants this? How do we use MVPs and the tools of Lean Startup to find out? That's something we'll spend a lot more time on in the next course in the specialization, Hypothesis-Driven Development, as well as the course after that, Agile Analytics. But essentially we're creating experiments where the user is given a choice, and by making the choice they show they're interested in the proposition. I'll show you an example and a counterexample of that in a second. And then, finally, we have the usability hypothesis, where we do single-subject usability tests. There, we're not worried about whether we have the right problem.
We're not worried about demand; we're specifically focused on answering the question: can the user use this software? Throughout the specialization you will learn how to do all of these things in an afternoon, and you can get enough practice that you can just pick up and do just enough research to answer the questions you've got. I hope you'll get comfortable with that habit, and comfortable with the idea that good design research and good customer discovery isn't something that holds up to any possible question that might come up. It's just enough research to answer the questions that matter to you right now. More is too much; less is too little. Here's a really interesting story. Philips ran a focus group, I believe, where they had young people look at their new boombox designs. They had a new yellow one they were thinking of making, and they asked, "Hey, do you like this yellow one?" And they all said, "Of course, that sounds great. I'll have my $30 now for participating, and I'm going to go home and watch whatever young people watch on TV nowadays." What Philips also did, though, was offer each participant a boombox to take with them on the way out. And you know what happened? They all took the black boombox. So why is that? Why did they lie to the moderator? Well, this is the reason why, when we look at these different research methods, there is a difference between what we do in subject interviews and what we do in demand experiments. If we could just do the subject interviews and then ask the subject, "Now that we understand your problems, would you like to buy one of these?" and get a reliable answer, that would be so much better; it really would. But we can't. The yellow-boombox story shows us why we need those separate demand experiments. So I would think about your demand hypothesis as distinct from the material you're learning about now with subject interviews.
Later in the course we'll focus on this, but one key point is that we always want to frame these demand hypotheses like this: if we do something for the persona, then they will respond in a certain way. Instead of just asking them, "Would you want this?" as a hypothetical, we have to present them with a real choice. That might mean putting up a Google AdWords ad and seeing whether somebody out there on the internet clicks on it or not. In an enterprise context, it might mean saying, "Hey, if you think this is really important, what do you think about prepaying for a bunch of licenses?" You'll learn more about that in the resources at the end of the course, but it's different from the work we do here talking to subjects, and also different from usability testing, which you'll also learn about later in the specialization. So the punchline is: learn how to focus your research to answer the questions you have; acknowledge that just because you did some research before doesn't mean it will answer all the questions you'll ever have, and that's normal; and try to make time to do it. It doesn't necessarily need to be a lot of time, especially once you get more practice doing this. A couple of course resources if you feel like you want to know more about this right now, and that's fine. First, you can see a link here to a talk called Just Enough Research by Erika Hall; it's an excellent talk, essentially about this very topic. Second, there's a link to the customer discovery handbook that I published, which has a cheat sheet, a set of cookbooks, for how to do these three different types of inquiry, depending on where you are in that continuous design process and what questions you have at the moment.