Hey folks, in the last video, Lauren talked to you about this famous notion of a user interface Hall of Fame and a user interface Hall of Shame. In the next couple of videos, we'll be walking you through a few user interface case studies, some of which fall on the Hall of Fame side and others of which fall on the Hall of Shame side. I get the honor of starting out with a couple of case studies that are definitely on the Hall of Shame side. In fact, they are major user interface disasters. And the key takeaway from this video is that design really matters. Not just because it'll help your company make money or make your users happier, but because it can save lives, prevent injuries, and prevent other major types of disasters.

So the first small case study I'm going to talk about is this one here. As I mentioned earlier, in addition to doing a lot of work in human-computer interaction, I also do a lot of research in spatial computing, and this case study falls right at the intersection of those two topics. This is a Washington Post news story describing what happened. An elderly couple was going to meet their daughter in Brazil, and they typed in the address of where they were meeting her. It turns out there are multiple streets with that same name. They ended up going to the wrong one, and that one happened to be in a very dangerous area. Tragically, one of the two people in the couple was killed.

Now, oftentimes this type of disaster is attributed just to the user, right? They didn't notice it was the wrong place. How could they be so stupid, right? But the design view is that we need to find ways to prevent people from making these types of mistakes. Every human makes mistakes. How can we prevent them from doing so?
And in fact, one of the key frameworks we'll be learning later in the course is called heuristic evaluation. It's just one of the frameworks we can use to analyze what happened in this disaster, but it's a very effective one, and it's also very straightforward. Heuristic evaluation outlines ten (or around ten, depending on who you ask) different heuristics that we can use to evaluate our interfaces. Heuristic number 5 is about error prevention. Let me read it to you: "Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action."

So let's unpack this with respect to what happened to this Brazilian couple. Can we eliminate the error-prone conditions? It's unlikely that we'll change all street names around the world so that each one is unique. Right here in the United States there are probably tens of thousands, if not hundreds of thousands, of streets named Washington or streets named Adams, and in many other countries it's the same way. It's unlikely that we'll be able to give every place a unique identifier. Algorithms that can help you figure out which Washington Street or which Adams Street you want are getting better, and in fact my students and I work on this problem, but again, it's unlikely these will be perfect any time in the near future.

So the good news is that heuristic number 5 also tells us what to do: check for these error-prone conditions and present users with a confirmation option before they commit to an action. How might this look in the case of the Brazil situation? Well, they were using the Waze app, so you can imagine Waze adding a warning.
When you start routing yourself to an area that has a government advisory, for instance, one that says non-residents maybe shouldn't go there, the app could say: "Warning, are you sure you want to go here? There is a government advisory against non-residents visiting this area." You would have to tap either "I meant somewhere else" or "I understand the risks and I want to go here." This probably would have saved that poor person's life. Design can save lives.

Now, another way that design can save lives is this very famous case study in user interface design: the Therac-25 radiation therapy machine. Effectively, what happened here was that the machine was designed in a series of terrible ways, one of which was its user interface. Ultimately, six people were seriously injured or died because of this machine. They were going into the hospital to be healed, and due in part to poor user interface design, they were instead harmed. There were a series of problems with the user interface design, and again with other parts of the system too, but one key lesson that came out of this was, again, finding ways to prevent errors. The system allowed a technician to deliver far more radiation to a patient than anyone would ever need. Had they built a safety threshold into the system, it would have been very easy to prevent those six major accidents.

Another problem was system visibility. We'll be talking a lot about system visibility later in the course. The technician could not see, or could not easily see, how much radiation they had delivered to the patient, and that is extremely dangerous in that context. If you can't see that you've already delivered a full dose, it's easy to say, "I saw this light, this error message. Did the full dose get administered? It seems like it didn't; I'm going to give another dose."
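Heuristic 5's remedy, checking for an error-prone condition and asking for explicit confirmation before committing, and the hard safety threshold the Therac-25 lacked, can both be sketched in a few lines of Python. Everything here (the advisory list, the dose limit, the function names) is a hypothetical illustration of the pattern, not real Waze or Therac-25 code:

```python
# Sketch of heuristic #5, error prevention. All names and numbers
# below are hypothetical; this is not real Waze or Therac-25 code.

# Hypothetical set of areas carrying a government travel advisory.
ADVISORY_AREAS = {"High-Risk District"}


def start_navigation(destination: str, user_confirmed: bool = False) -> str:
    """Check for the error-prone condition and require explicit
    confirmation before committing to the route."""
    if destination in ADVISORY_AREAS and not user_confirmed:
        return ("WARNING: there is a government advisory against "
                "non-residents visiting this area. Tap 'I meant somewhere "
                "else' or 'I understand the risks' to continue.")
    return f"Routing to {destination}"


MAX_DOSE_CGY = 200  # hypothetical hard limit on a single dose, in centigray


def deliver_dose(requested_cgy: int) -> int:
    """The kind of threshold guard the Therac-25 lacked: refuse any
    request beyond what a patient could ever safely need."""
    if requested_cgy > MAX_DOSE_CGY:
        raise ValueError(f"Requested {requested_cgy} cGy exceeds the "
                         f"{MAX_DOSE_CGY} cGy safety limit")
    return requested_cgy
```

The point is not the specific names or numbers but the shape: the design removes the silent failure path, so the risky action cannot be taken by accident.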
Speaking of which, very famously there was a terrible error message in this machine that said no dose had been delivered when in fact a dose had been delivered. A very deadly mistake, even though it was a small bug in the code. There was terrible documentation. And then, critically, and this speaks to a lot of what we'll be discussing in the course, they failed to involve users in the design and testing of this machine. A lot of these problems could have been sussed out very easily had they engaged in some of the practices that you'll be learning in this course.

Okay, the next disaster is one that's very recent. This happened, as of this recording in late June, about a week ago. I know probably many of you, like me, are Star Trek fans, so we were very sad to learn that Anton Yelchin, the new actor who plays Chekov, died in what was called a freak accident with his car. It turns out this actually wasn't a freak accident; it was a design error. What had happened was, he had bought a Chrysler car with a new type of gear shift, and this gear shift didn't operate like typical gear shifts. With a typical gear shift, you shift it up and the lever stays there. But this one is more of a toggle: you shift it up, the little P or R or N lights up, and then the lever springs back to the middle. It turns out this was a well-known problem. The US government had actually recalled a lot of Chrysler cars because of it. It confused people. They were used to seeing the state of the system made very visible in the location of the gear shift, but in this car the lever is always in the middle. That confused people into thinking their car was in park when in fact it was in reverse or neutral, and that's what ended up killing Yelchin. Another example of how a poor design can lead to tremendously sad events.
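To make that confusion concrete, here is a minimal, hypothetical sketch of a toggle-style shifter that nonetheless keeps the committed gear prominently displayed, instead of relying on a lever position that always springs back to center. The class and method names are my own invention for illustration:

```python
class GearSelector:
    """Toggle-style shifter that keeps the COMMITTED gear visible.
    The problematic design showed only the lever, which sat in the
    same center position no matter what gear the car was in."""

    GEARS = ["P", "R", "N", "D"]

    def __init__(self):
        self.gear = "P"  # the committed state, independent of lever position

    def toggle(self, direction: int) -> str:
        """Momentary toggle: +1 steps down the list (P -> R -> N -> D),
        -1 steps back up; out-of-range presses are clamped."""
        i = self.GEARS.index(self.gear)
        i = max(0, min(len(self.GEARS) - 1, i + direction))
        self.gear = self.GEARS[i]
        return self.display()

    def display(self) -> str:
        # Always render the committed gear, highlighted in brackets.
        return " ".join(f"[{g}]" if g == self.gear else g for g in self.GEARS)
```

Because `display()` always shows the committed state, a glance tells the driver the car is still in reverse, even though the lever itself looks identical in every gear.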
So again, the key design lessons here are about system visibility; we'll be learning a lot about system visibility. You always want to make sure the state of your system is very, very visible to users. This matters for minor annoyances, like figuring out whether or not you have Caps Lock on, and it also matters for potentially saving your life from your car running over you.

Another design lesson that comes out of this incident is about standards. If you're going to change a standard that's incredibly important and incredibly common, like how a gear shift works, you have to be very, very careful. Chrysler has some great engineers, and they probably thought they were being very, very careful, but violating a standard, changing a standard, is always something that's difficult to do. We'll talk more about the importance of standards later in this course. And then finally, had they just used the best practices in user-centered design, like the ones you're going to learn in this course, this problem probably would have shown up. Had they brought in enough people and done user evaluation with them, they would have realized, hey, some of these users are having a hard time understanding what gear their car is in. Maybe we shouldn't do anything major with this gear shift; let's just use the standard one we've been using for decades.

Okay, so a lot of sad stuff in this lecture. I wanted to close with a UI disaster that wasn't quite so sad, or where the impact wasn't quite so large. Earlier this year an American made a very similar mistake to the Brazilian couple we talked about earlier. Instead of a duplicate street name, though, this person typed one wrong letter, slightly misspelling where they were going while visiting Iceland. They ended up traveling from Reykjavík, which is in the southwest part of the island, all the way to the far north of the island, something like a seven-hour journey.
When they left the airport, they meant to go to their hotel in Reykjavík, 30 to 50 minutes away, and instead they went seven hours away. Word got out, and this actually became very famous in Iceland. The Icelandic news came and interviewed this guy; he was in all sorts of newspapers and these types of things. He ended up getting all sorts of Icelandic delicacies from folks who wanted to have him over at their restaurants and whatnot. So I guess even though some of these disasters can lead to very bad things, occasionally they can also lead to some humor. All right, with that I will turn things over to our next case study, and I'll be seeing you soon.