This week, we're going to focus on user experience analytics: how do changes to the user experience of our website, enterprise software, or mobile app affect user behavior? How does this relate to the material that's come before, and what's different about it? Let's talk about that for a second, because everything we've done so far is a foundation for this. When we talked about customer analytics, we were really looking at the big picture of how we're delivering on our persona and problem hypotheses with our value hypothesis. We zoomed out to make sure we're dealing with a problem that actually exists, delivering on a habit, and offering a proposition that's better than the alternatives. How do we keep that big picture in view so we don't get lost in the weeds? That's why we looked at that table where you go through the different parts of the funnel and think, big picture, about which big independent variables we want to test. Now we're going to get down into the details more. How does this relate to last week, where we talked about Demand Analytics? There, with the material around Lean Startup, some of those tests just looked at behavior in the field, but mostly we were paring motivation away, isolating it, and testing it separately, so we could get clearer, more clinical results and understand the relationships between these things. This week, we're going out and observing users from afar, where we don't get the benefit of isolating and testing these things specifically.
So where last week we focused on motivation, this week we're going to focus primarily on usability, and also on field analytics: understanding what happens out in the field with a product, where we're just observing action or inaction on the part of the user, we're not sitting with the user, and we're not able to create an experiment that isolates one of these things. The good news for Agilists is that the centerpiece of this work is the user story, meaning well-articulated user stories that have all three clauses. We'll briefly review those, we're going to see a lot of examples of them, and there are references in the course about creating good Agile user stories, what that means, and how it relates to testing. If you joined me for Hypothesis-Driven Development, we talked a lot about qualitative usability testing; we'll briefly review that and its relationship to prototypes. Then we'll be primarily focused here on application analytics: how do we pair the live analytics we put into our product, covering the thousands of users we don't get to actually talk with, with the other things we've learned, so we can make good decisions about where to focus and what to change and not change in the product? One of the most powerful tools for this, I think, in helping your team keep the big picture in mind while giving them the space to focus on the details of what they're doing, is the story map, where we've got our qualitative hypotheses, our persona and problem hypotheses, and our user journey up at the top.
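To make the idea concrete, here is a minimal sketch, not from the course itself, of a three-clause user story and how an application-analytics event might be tagged back to it so field observations can be traced to the hypothesis being tested. All names, IDs, and event fields here are hypothetical illustrations.

```python
# A three-clause Agile user story: a persona clause ("As a..."), an action
# clause ("I want..."), and a motivation clause ("so that..."). The third
# clause is what field analytics ultimately tests: did the user actually
# get the outcome the story promised?
story = {
    "id": "reorder-one-tap",  # hypothetical story identifier
    "persona": "As a returning shopper,",
    "action": "I want to reorder a past purchase in one tap,",
    "motivation": "so that I can restock without searching the catalog.",
}

def analytics_event(story_id: str, event: str, user_id: int) -> dict:
    """Build a minimal analytics event tied to a user story, so that
    observed behavior in the field maps back to a specific hypothesis."""
    return {"story_id": story_id, "event": event, "user_id": user_id}

# Record that user 42 completed the action the story describes.
evt = analytics_event(story["id"], "reorder_completed", user_id=42)
print(evt)
```

The design choice worth noting is the `story_id` field: keying every event to a story is what lets you read raw field data (action or inaction) as evidence for or against a particular user-story hypothesis, rather than as an undifferentiated stream of clicks.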
Then we've got the big picture: the dumb questions, if you will, the obvious questions about what we're trying to learn about the user journey at each of these steps. Then, as we dive in, there are all the little details we need to attend to to make all of this happen and work well, and good collections of details and thoughtful execution are how you get to a good product and a good user experience. How do we pair that with the big picture? I think the story map is a really great way to organize your thinking about how this fits with the larger picture of what you're doing. All right, let's get into the details of how we run a good Agile program around UX analytics.