Once you're at the stage of trying these things out, this is where you need to collect data. Of course this can be test scores or quantitative feedback. But I also think it's incredibly important to dedicate the resources to observe what is happening in these classrooms. It's literally about getting another teacher, or a principal, or even a video camera if you have no other choice, to really watch what happens, because you can learn a huge amount if you study it closely. But typically in education, we just throw a bunch of stuff at the wall and then later try to remember what we thought worked or didn't. So my best advice is to commit to having an observer there who can really watch and process with you, because you can learn a huge amount from every one of these trials.

>> One other thing to that end: don't forget about feedback from students themselves. Summit Public Schools does a great job of this; they use focus groups and regular surveys to collect feedback from students about what is and is not working.

>> Understanding student voice and understanding student experience isn't just a way to kickstart a design process. It is absolutely the blood that pumps through every aspect of the design process. You hear what students say, then you prototype, you create an idea, and you put it out there. Then you have to hear what they have to say again, and what they think about it, and that is what you use to continue to iterate. Student voice is really the engine of the design process. And some people are nervous about taking a leap into the unknown; they find it nerve-wracking to stray from what they're used to, and they worry it might hurt students.
But really, at the end of the day, when you put students at the engine of the design process, you can't go wrong, because you're always coming back to them, and they will hold you accountable to a higher standard than you could ever hope to hold yourself.

>> The reason you need to keep iterating is that you will not get this right, right out of the box. None of us is actually smart enough to design the perfect model on paper, and you need to de-risk this for yourself and allow some failure. But that concept of build, measure, learn will let you keep the virtuous cycle of innovation going, and you will get to better results.

>> So to understand what iteration looks like in practice, let me give you an example. When we launched the idea of a playlist for students who are self-directing their learning, they would go to a playlist and select how they would learn before moving on to show what they know. Our first playlists were based on what you would find on your iPod: we just put a whole bunch of resources together and gave it to kids. Well, it didn't accomplish what we wanted the playlist to do. Honestly, kids weren't learning that way. We had data showing they weren't learning from the playlist, which wasn't what we wanted, and we had some ideas about how to improve it. So we took those ideas, iterated on the first version, and tested them. We said, what if we take the playlist and divide it into groups, and organize the resources under a header that says, here's an objective you want to learn and here are some resources around it? Would that improve things? And we did that. We gathered student feedback and we heard their voices.
We looked at their performance data, and they started to learn a little bit more. We learned, and then we went through the cycle again, and again, and again, and each time it got better and better. So today the playlists are significantly improved. They have introductory sections; they start with a diagnostic assessment where kids can really see where they are; they conclude with a direct link to the final assessment. They have ways for kids to mark what they've already done and keep track of their progress, and they allow kids to crowdsource their feelings about the playlist. Did it work for me, or did it not? And they give their peers a whole set of rankings about what's effective. So that's an example of going through the cycle multiple times, using the measurement and the learning to continuously iterate and improve to a place where we feel really good now.

>> In school settings, there's actually an added challenge. Sometimes you have a great theory or a perfectly constructed idea, but when it hits the reality of real schools and real students, it all falls apart. Maybe the internet's down one day. Maybe a student had something traumatic happen at home, and they come in and ruin the lesson for others. And it's really easy to throw the baby out with the bathwater. But sometimes you just need to try a different iteration, or stick with something through the difficult stage while you're learning how to do it well.

>> So implementation really matters. And this is where Brian and I would say you really have to trust the gut of actual educators on the ground about when it's worth doubling down on something, or when you have to step away from it because it's not working.

>> I have a friend who ran a network of schools here in California. They tried a really thoughtful pilot of a new piece of software, and he did it right.
He had a small group try it, he measured the results, and they actually got very big gains for their students. So then it was clear the right thing to do was: let's scale this, let's put this into all of our schools. They shared the data, they made a plan, and they rolled it out, and, as happens, he turned his attention to all of the other parts of the job. A couple of months went by, they got the data back, and the results actually weren't very good. So the first thing everyone else involved said was, see, it's not a good piece of software; it doesn't work. And his response was: no, it does work. It didn't work the way we just did it. Sometimes it's about sticking with it, or going back and looking at the iterations or the implementation to figure out the right way to make this work.

>> So what does this mean for you? We get that most of you, in the pilot you're going to create for this course, aren't going to assemble a large team and do a whole-school redesign process. If you are in that setting in your own school, then that's great; hopefully this material has been very useful. But most of you are probably going to do a mini-version: an implementation of blended learning in your own classroom. And we hope these six steps are helpful to drive success and help you get a better result.

>> So with all of this in mind, we really want you to experience trying these concepts of blended learning out yourself, which is why we designed this whole course around doing it in action. And if you can't actually do your final assignment in person with a real group of students, we do have an alternative option for completing it. But we can't stress enough how important it is to actually experience this yourself, even if it's just with a couple of students.

>> So it's time to roll up our sleeves and actually do the work.
So what we're going to do is bring Rob in to talk about the final assignment, and then we'll come back and do a final wrap-up.