One of the things we're after this week is how you can zoom out, look at the big picture, and parse it into pieces that are workable for your team, so you can focus on one piece at a time and do good work there, and yet still have everything on the table and properly prioritized, and also be able to relate what you're doing at any given week, moment, or hour to the larger picture of what you're trying to have happen with the customer. To that end, we're going to look at a few different frameworks for how we go from a user who doesn't even know our product or feature or infrastructure exists to a happy, habitual (and, if it's a product, monetized) user of whatever it is we're working on. All right, so the first framework we're going to look at is AIDA, which stands for attention, interest, desire, action; I've added two more steps, onboarding and retention. Fun fact: this is one of the oldest business frameworks still in use today. It's from the 19th century, from one of the original marketers, or proto-marketers, and it had the four steps you see here. I added onboarding and retention because back then they were selling soap and snake oil and such, and they didn't have this idea of onboarding people onto a platform. So let's talk about what these steps mean and how we go from somebody who doesn't even know our thing exists to a happy, habitual user of our product, feature, or infrastructure. Attention is where we think about how this user goes from not knowing anything about our product or feature to knowing about it. We're asking: how do they even find out that our proposition exists? And above the noise floor of everything else out there, how do we break through and get their attention in a way that's relevant and helps us transition to the second step, interest: what engages them with your proposition?
So, for example, we might get somebody's attention with an AdWords ad or a Facebook ad; let's say that's the first 500 milliseconds to one second of interaction with this person. Then we get their interest with a landing page or some kind of focused explanation of what we're talking about, which earns us their engagement for the next four to six seconds, maybe. Desire is not directly measurable, but frankly, that's what I like about this framework. Desire is a checkpoint that triggers the team to zoom out and ask: what are we doing about this job to be done, this problem, this habit the user has, where they're going to be interested enough to try out our alternative relative to what they already have? What's going to motivate them? What emotional resonance do we have with that moment, where we're getting them over the activation energy to go to the next step, which is action? And really, the question there is: what is the minimum set of things they have to do to try out our product or feature, whatever it is, and get some kind of early initial reward, some kind of validation that it's worth spending their time on it, that it's better enough than their alternatives? And how do we layer and sequence that in a way that gets us downstream to onboarding, which is really the minimum set of actions to deliver a substantial reward relative to one of our jobs to be done or problem scenarios? So action is: how do we get them to this point? How do we get them to sign up for a trial, for example, or put data into this feature so they can see how it works? Onboarding is how we layer in the next step and minimize the set of things they have to do to get some kind of early initial reward. And then retention is, frankly, kind of a catch-all for everything that happens after that, and there's a lot of variation there.
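One way to get these stages "on the board" with a team is to instrument them. Here's a minimal sketch, in Python, of counting users at each measurable AIDA stage and computing step-to-step conversion rates so you can see where the funnel leaks; the stage list, counts, and key format are my own illustrative assumptions, not something prescribed by the framework. Desire is left out of the measured stages, since, as noted above, it isn't directly measurable.

```python
# Illustrative sketch: users counted at each measurable funnel stage.
# Desire is omitted from the measured stages (it's a team checkpoint,
# not a metric). All numbers below are made up for illustration.

STAGES = ["attention", "interest", "action", "onboarding", "retention"]

def conversion_rates(counts):
    """Return the fraction of users surviving each stage-to-stage transition."""
    rates = {}
    for prev, nxt in zip(STAGES, STAGES[1:]):
        if counts.get(prev, 0) > 0:
            rates[f"{prev}->{nxt}"] = counts.get(nxt, 0) / counts[prev]
    return rates

counts = {"attention": 10000, "interest": 1200, "action": 300,
          "onboarding": 150, "retention": 90}
print(conversion_rates(counts))
```

A table like this makes the "focus on one piece at a time" idea concrete: the transition with the worst rate is a natural candidate for the piece the team dials in on next.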
And I think a good pair of questions to ask is: what is success for this user? What does that mean, what does that look like, and how are you going to measure it? And then, what does that mean for you? For example, if you have a paid engine of growth, is their lifetime value enough to justify what it costs you in paid channels or advertising investment to bring them in and monetize them? If your product is fundamentally viral or organic, and you're relying on word of mouth and people talking about how great your product is, are they sharing? Are they helping you broadcast your message organically? And if your primary engine of growth is sticky or scope, in other words, you're going to work really hard to have a long, broad customer relationship with a lot of transactions, then is that happening with this customer over time? So: what does retention mean, what's success for them, and what's success for you? That's what I would be asking. Again, the idea here is just to get all of this on the board. The idea is not to be working on all these things simultaneously all the time; it's to have a coherent idea of how you're going to parse the whole thing out, and then be able to dial in, focus on one thing at a time, and do it well. Let's look at another framework, a little bit simpler, that I think is maybe a better fit for, say, internal IT projects. One simple way to talk about these things is to set 0-, 30-, and 90-day criteria. Day 0 is: can they use it? Is it usable? In particular, if it's an enterprise software project, I strongly suggest you make this mean: can we give them real, actual sample data from their work? For example, if it's a CRM, tell them about a lead, give them a little input, and see if they put everything into the fields the way you expect and intend, naturally, without you prompting them.
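The paid-engine question above, "is their lifetime value enough to justify what it costs to bring them in?", can be sketched as a simple arithmetic check. The formula, the 3x rule of thumb, and every number here are illustrative assumptions, not figures from the talk:

```python
# Hypothetical check for a paid engine of growth: does lifetime value (LTV)
# justify the customer acquisition cost (CAC)? All figures are made up.

def ltv(avg_monthly_revenue, gross_margin, avg_retention_months):
    """Simple LTV estimate: margin-adjusted revenue over the retention period."""
    return avg_monthly_revenue * gross_margin * avg_retention_months

def paid_engine_healthy(ltv_value, cac, min_ratio=3.0):
    """A common rule of thumb: LTV should exceed CAC by some multiple."""
    return ltv_value >= min_ratio * cac

customer_ltv = ltv(avg_monthly_revenue=50, gross_margin=0.8,
                   avg_retention_months=18)  # = 720.0
print(customer_ltv, paid_engine_healthy(customer_ltv, cac=200))  # 720 >= 3*200
```

The sticky and viral engines would each get their own analogue of `paid_engine_healthy`: a repeat-transaction check and a sharing-rate check, respectively.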
Or if it's a purchase order system or something, can they do the same thing? Rather than being overly simplistic, make sure you exercise these systems with real, actual sample data, real, plausible things they're actually going to do, and see how it works out before you push it out to everybody. 30 days is: are they still using it? We onboard them the way we think is going to work, we leave them alone for a bit, and we assume they're going to stay engaged with this thing that's supposed to work for them. After 30 days, how do we observe whether they're still using it, whether they've created habits? If they haven't, then we need to fix that. And finally, 90 days: let's say our product is supposed to make processing purchase orders easier, or make salespeople more productive. Is that actually happening? Is the underlying promise of our proposition getting delivered on? Because if it isn't, then what we might see is that they used it at the 30-day mark and then kind of tapered off, because the thing's not really delivering the way it's supposed to, and that's what we should attend to. So those are two ways of zooming out and thinking about the overall arc of the user experience, and how you get from zero to a successful place with your user, so you can get those things on the board and think about them with your team.
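The 0/30/90-day criteria above can be sketched as checks against usage logs. This is a minimal illustration, assuming you record, per user, the days since signup on which they used the product; the function names, the retention window, and the purchase-order timing example are all my own assumptions:

```python
# Hypothetical sketch of 0/30/90-day criteria as checks on usage logs.
# `events` maps a user to the set of days (since signup) with activity.

def day0_usable(events, user):
    """Day 0: did they complete a first real task with their own data at all?"""
    return 0 in events.get(user, set())

def day30_retained(events, user, window=(23, 37)):
    """Day 30: any activity in a window around day 30, i.e. a habit formed?"""
    return any(window[0] <= d <= window[1] for d in events.get(user, set()))

def day90_delivering(baseline_minutes, current_minutes):
    """Day 90: is the underlying promise delivered, e.g. faster purchase orders?"""
    return current_minutes < baseline_minutes

events = {"alice": {0, 3, 29, 31}, "bob": {0, 2}}
print(day0_usable(events, "alice"),     # used it on day 0
      day30_retained(events, "alice"),  # still active around day 30
      day30_retained(events, "bob"))    # tapered off: a retention problem
```

The taper-off pattern described above would show up here as `day30_retained` passing while `day90_delivering` fails, which is the signal to look at the underlying promise rather than the onboarding.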