Welcome back. I'm delighted to introduce this module, An Introduction to Accessibility and Universal Design, where we're going to start with what these things actually mean, and then take you through why you should care and what you can do about it. Before I jump in, I want to introduce a guest expert who will be joining us for the first half of the module. Phil Kragnes is Manager of the Computer Accommodations Program in the University of Minnesota's Disability Resource Center, and has 18 years of experience in technology accommodation, which, for those of you who have been keeping track, is about 18 years more experience than I have. Which is why I brought him in here to make sure that everything we cover is accurate and meaningful and also, frankly, useful. So we'll have Phil as we go through this introduction and the following four lectures. And then we have other people with expertise that you'll be meeting later on in the module. So let me just start this by jumping into some terminology and some concepts. I've titled this From Accessibility to Universal Design, but Phil was just reminding me before we started that I probably should start with the concept of accommodation. The first way most people think about making a computer system, or frankly anything, available to somebody with a disability is some special way of helping them use it, because it was not designed to be usable in the first place. And there are lots of examples you could imagine of this. Say I've just designed an elevator system, but, lo and behold, blind users can't read the text labeling each of the buttons, because it's black type that's lit up and there's nothing raised. An accommodation might be a button you can press to talk to an operator who will send the elevator to the right floor.
But if you had designed it to be accessible in the first place, perhaps by having raised text and/or Braille labels, or having the buttons in a sequence clearly anchored to the ground floor and the floors above and below, you might not need some special accommodation. When we think about accessibility, there are two ways to think about it. You could think about it as a legal imperative. In the United States, we have the Americans with Disabilities Act, which covers many government functions. The Rehabilitation Act of 1973 has a section, Section 508, that deals specifically with computer technology accessibility. Different parts of the world have different governing rules, some of them much more explicit about the rights of people to have systems that they can use, some of them a little bit less detailed, or in some parts of the world not even very well specified. But you can also think about accessibility as a moral imperative. It's just the right thing to do, when you're designing something, to make it available to all of the people who might want to use it. You don't create a hotel or a restaurant where the only way in is up a long flight of stairs, because there are people who can't climb a flight of stairs. Your hotel, your restaurant, your app, your software, your bank machine, all of these have advantages in being open to the public. There's a third way, though, to think about this, and it's where the concept of universal design has come in. Universal design is frequently a win-win. If you design things right, the accessibility you provide to people who couldn't use the system without a proper design also turns out to be beneficial to other users who may never have expected that they had any special need in the first place. I'm going to go very briefly through three examples of this in both physical and virtual design.
The classic example is curb cuts. In most cities in most of the world, if you look at the sidewalk where you're supposed to cross the street, the curb is cut down so that you can roll into the street rather than having to step up or down. That wasn't true when I was growing up. That wasn't true 40 or 50 years ago in most of the United States, and actually most of the world. Curb cuts were put in, first and foremost, in recognition that there were people who needed to move in wheelchairs, who otherwise couldn't get up onto the sidewalk or down into the street. But if you look at curb cuts today, they're used very widely for things that have nothing to do with mobility limitations, or at least not physical disabilities as we've thought of them in the past. They're used by people pushing strollers to make it easier to push their children across the street. They're used by people with bicycles. They're used by people with skateboards. They're frankly used as a visual marker of where you should cross, as a guide to pedestrians. All of that comes as a side benefit of thinking hard about the problem of how to make it easy to get into and back out of the street. Closed captioning, the text labels that come along with television programs, movies, operas, and even these lectures, is another example. You may think of captions as something that exists primarily because there are folks who are hard of hearing or deaf. And indeed that's why they exist, but their usefulness has gone way beyond that. We find that closed captioning is used in bars where people want to watch sporting events, and they want five or six different events on at the same time, and you can't have the noise competing with everything. But by putting a caption up, people can follow what's going on. We see them used in gyms for the same purpose.
We see them used by people who are trying to learn a language, by having the running text go alongside the spoken text. It becomes an easier way to learn, and in fact, closed captioning has just become generally valuable in all sorts of circumstances. The last example I'm going to use is mixing auditory with visual alarms. If you've been in a building where the fire alarm has gone off, in most modern buildings today, it's not just a sound. It's a very bright blinking light that is hard to ignore even if you somehow manage to ignore the really loud sound. Combining those two in one form is a way of accommodating people with either lack of vision or lack of hearing. But we should also recognize that some people are in a circumstance where one of those limitations is temporary. So if I have my headphones on, my nice noise-canceling, hard-rock-playing headphones, I might not hear the alarm, but I can hardly avoid seeing the flashing. On the flip side, if I was focusing really intently on some sort of visual task, I might not notice the flashing, but I sure can hear the loud sound. And the result is everybody comes out during the alarm. So, let's talk about the kinds of challenges that different users face that might be opportunities for not only accommodation, but for accessibility through universal design. And, Phil, maybe you can take us through, starting with the types of sensory impairments that affect really large numbers of potential users. >> All right, Chase, so far you've hit on visual impairments and auditory impairments, which are two pretty limiting factors when you consider society as a whole and when you consider technology. Often we'll rely on auditory cues to let us know that a computer has encountered an error. Or we have flashing icons or the use of color to indicate some special feature of a webpage.
With the production of sound, obviously, a person who is deaf or hard of hearing may be completely unaware that any indication is occurring. Similarly, a person with visual impairments may be unaware that an icon is a certain color or is flashing or something of that nature. There are also cognitive impairments that may interact with these items, issues such as attention deficit disorder and attention deficit hyperactivity disorder. So you have an interface where something has popped up on the screen suddenly, and that person's attention is drawn to the new item. And they lose focus and are unable, or have a very difficult time, returning to their original task. And as we age, although the impact may not be severe enough to be called blindness or to be considered a severe cognitive deficit, those users are affected as well. And so when we're designing something for use by the general population, we need to consider these individuals and the potential challenges that they face, and how our design choices affect the way information will be presented. We often think, okay, well, I don't want to use sound because my deaf users will find it difficult or impossible to interpret those indicators. I don't want to use color or things of that nature because of individuals with visual impairments. But sometimes we can gain the greatest effect by combining those features. So we have audio, but we give people the ability to turn on a feature that flashes the screen, or on a handheld device flashes the LED, to let them know that an audio event has occurred. Or we put in coding of a certain nature that doesn't change the visual appearance, but lets a non-visual user know, hey, this area, or this icon, or this block of text is important for this reason. So keeping those factors in mind makes for a more usable design for all users. >> Wonderful, so let's continue down this list. You've talked about ADHD, attention deficit hyperactivity disorder.
You've also had a bunch of experience working with users with different kinds of learning disabilities. I think that's a term that a lot of us have heard, but we don't necessarily know what it really means. Could you break that down for us a little bit? What do we mean when we say somebody has a learning disability, and how might that affect our ability to design an interface that they could use? >> Well, unfortunately, learning disability is often equated with low IQ or information processing difficulties, and that's not the case. There are three factors primarily involved in a learning disability: information acquisition, information storage, and information retrieval, and many things can influence each of these. So when we talk about information acquisition, loading a screen with large blocks of text with many different colors, multiple sections, etc., can make it very difficult for a person with a certain type of learning disability to pick out the information of importance or the information they are seeking. We'll see places where a list of items is presented as item 1, comma, item 2, comma, item 3, comma, item 4, comma, and so on, rather than as a nice, vertical, linear list of bullets. I often think about politicians when we're talking about learning disabilities. They're up in front of an audience, they don't want to look stupid, and they want to be able to glance down at their podium and see each item with a bullet point and be able to refer to it quickly and easily. Well, that's what we all want. And so what benefits those people with information acquisition limitations benefits the acquisition of information for all of us. Then there's storage, RAM or memory as we often think about it. Consider having large blocks of text, or let's say you do use bullet points, so acquisition isn't the problem, but each of your bullet points begins with the same word.
Trying to build a structure where that information can be parsed easily is difficult when you're trying to store it based on different words, sentence structure, and the meaning of the sentence. So use variety; variety is remembered much better. It's kept in storage because each component is given a separate path. Retrieval, again, I think both of the factors I've just discussed influence retrieval. Obviously, if you have a storage limitation, then retrieval is going to be affected. Likewise, getting the information to be stored and then retrieved can be difficult if you have an acquisition impairment. Oftentimes we address learning disabilities by providing multi-modal information. It may be color combined with text, combined with audio, combined with bullets. The more ways we can distinguish information, of course without looking gaudy, the more likely it is to be remembered. We can also keep in mind very simple psychological principles. There are two in particular that can influence memory, and those are what we call primacy and recency effects. When you have a list of items, the first item or two in the list are remembered much better than items in the middle of the list. That is, they appear first in the list; that's the primacy effect. Likewise, the last one or two items at the end of a list are remembered better; those are recency effects, in other words, the information you read or encountered most recently. So by structuring our layouts, our designs, our tabs, all those kinds of things, we can benefit all users, and especially those with learning disabilities. >> So one of the nice things coming out of all of this is that there are at least a couple of recurring themes. There's a theme around redundancy: don't depend on one mode of display if you can display things in multiple ways. And there's a theme around simplicity.
Both of which, as folks will see from the other parts of what we're teaching them on design, are just generally good design. And all too often the designs that are difficult for people with various forms of impairment are difficult for them because they were difficult for everyone to begin with. And this is a useful thing to recognize. Now as we move to physical and systemic limitations, we may be moving into territory that is not always as simple as just saying keep it simple, or add redundancy. What are some examples, when we're talking about mobility impairments, that might affect somebody using a computer system? >> When we talk about mobility impairments, of course, there's a wide range, as there is with any disability. Just as there are degrees of hearing loss and degrees of vision loss, physical impairments vary not only in degree, but in the areas of the body affected: the ability to move one's head easily, or the ability to use one's upper extremities or hands or fingers. When we're creating interfaces, we need to keep this in mind and not create tiny, tiny controls. Let's say someone has a tremor disorder, perhaps Parkinson's or some other condition that makes their hands tremble, or an intention tremor, where reaching with an arm or hand makes the tremor worse. If we make a little control that they're supposed to touch with their fingertip, and their tremor has the hand or finger moving all about, then the smaller that target, the more difficult it's going to be. Well, when we think about this, making the target an adequate size doesn't only benefit those people with mobility impairments. How often have you been in a car, hopefully not driving, trying to use a mobile device, and you're going over a rough road and trying to touch a specific icon or tab? The more severe the road vibration, the more difficult it becomes. Well, it's no different for a person with a mobility impairment.
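The target-size point can be made concrete with a small check. The exact minimum varies by guideline (WCAG 2.1's enhanced target-size criterion recommends 44 by 44 CSS pixels, and the Android and iOS guidelines are in a similar range), so this is a minimal sketch under that assumption rather than a definitive rule; the helper name is our own:

```typescript
// Minimum touch-target side length, in CSS pixels. 44x44 follows WCAG 2.1
// SC 2.5.5 (Target Size, Level AAA); platform guidelines are similar.
const MIN_TARGET_PX = 44;

interface Target {
  width: number;  // rendered width in CSS pixels
  height: number; // rendered height in CSS pixels
}

// Returns true when a control meets the minimum target size, which helps
// users with tremor as well as anyone tapping on a bumpy car ride.
function meetsMinimumTargetSize(t: Target, min: number = MIN_TARGET_PX): boolean {
  return t.width >= min && t.height >= min;
}
```

Such a check could run in a design-review lint pass over a screen's control layout, flagging any control smaller than the threshold.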
Reach may be another issue: moving the hand or finger from one area of the screen, maybe towards the bottom, up to the top or some other extreme location may be very slow and difficult, perhaps even painful, for some users. We want to divide our controls and cluster them in a way that is clear, but we don't want their locations to be so disparate that we're required to touch one control at the top and make a choice at the bottom. So keep in mind that mobility limitations may be reach or movement related. They may be speed, tremor, or accuracy related. And so, when we're thinking about controls, where to put those controls and how to cluster them can all influence how easily an interface can be used by someone with a mobility impairment, and by those individuals who find themselves in situations, like a moving car, that produce their own tremor or movement issues. >> And then there are other chronic health conditions that could affect somebody's ability to use a technological solution. I know in my experience I've run up against one of them, which was a case of a user with epilepsy. And flashing on a screen turned out to be a trigger that could cause that person harm and discomfort, and sort of temporarily disable the person. Are there other examples of this that we should be aware of? >> Well, yes, photosensitive epilepsy is one that's actually addressed by the World Wide Web Consortium in their standards. There are certain amounts of screen area and flashing rates that can trigger photosensitive epilepsy. There are many other disorders; diabetes is one that can again mimic a learning disability. In the case of someone with low blood sugar, the ability to take in large amounts or blocks of information can be very, very difficult. So those things we do for people with learning disabilities, such as breaking things into bulleted lists.
Making each of those bulleted words or phrases different from one another benefits people with diabetes and other related conditions. With lupus or chronic fatigue syndrome, again we go back to what was just said about mobility impairments: having to frequently move around the screen, moving one's finger or hand from one area of the screen to the opposite side or end of the screen, can be very taxing for these individuals. So knowing that when someone touches a control, any associated, related controls it pops up will be within a short, proximal distance of the activating control, rather than having the activating control at the top of the screen and the displayed choices at the bottom or elsewhere, matters again for people with mobility impairments and also certain types of systemic disabilities. >> So, we are going to go into each of those in more detail. Later in this module, you will also hear from people with expertise, both among our faculty and guests, talking about older adults and some of the physical and cognitive changes, and even technology comfort issues, that have to be addressed. Looking at interfaces for children, you get into issues of not only literacy and vocabulary, but physical manipulation, particularly with younger children and their ability to take direct paths, for instance, from a source to a destination with a pointer. And socioeconomic differences, where you have vast differences in the way people use technology as you look at people with different levels of resources and education and the environments in which they live. A last thought for this introduction. Our goals in universal design are about identifying designs that increase accessibility while improving the usage experience for all users. Sometimes this requires making the universal design ubiquitous.
You don't build an electronic curb cut where the curb goes down only when a wheelchair reaches it; everybody should have access to that curb cut. But sometimes this requires standard ways to turn universal design features on and off. Some people will find the text in a closed caption distracting and others will find it helpful, and having an easy way to turn captions on and off can be remarkably useful. There are other similar examples, as we were talking about motor issues. Some of you may have run into features like Sticky Keys by mistake. You pressed the shift key on a Windows machine enough times in a row, and it said, hey, we can turn this on for you. That's a feature, if you're not familiar with it, where instead of having to hold down the shift key while you type a letter to get a capital version of that letter, and similar combinations, you can do it sequentially: press shift, then f, and you get a capital F. That type of feature has been available in computers for a long time, but if it were turned on all the time, it would make standard typing much slower, because we actually benefit from being able to do multi-fingered combinations. And so having a standard way to turn it on and off is the better experience for everyone, even if sometimes you may want to turn it on because you're trying to deal with things one-handed while you're holding something else in your other hand. >> And I think an aside to that is, it's important in design not to override system settings and controls. If you disable the ability to press the shift key and get the Sticky Keys option, then those users who require it are going to be in trouble. Maybe someone uses the high contrast mode built into their operating system. If your design has overridden that, again, we have to be careful that if it's not an accessibility feature we're building in, we don't inadvertently override those that are built into the system.
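On the web, this caution about honoring system settings has direct hooks: CSS media features such as `prefers-reduced-motion` and `forced-colors` expose the user's operating-system accessibility choices to a page. A minimal TypeScript sketch (the helper names and fallback behavior here are our own illustration, not part of any standard API):

```typescript
// System accessibility preferences a design should honor rather than override.
interface SystemPrefs {
  reducedMotion: boolean; // user asked the OS/browser to reduce motion
  forcedColors: boolean;  // a forced color palette (e.g. high contrast) is active
}

// Read the preferences via window.matchMedia when running in a browser;
// fall back to permissive defaults elsewhere (e.g. during server-side rendering).
function readSystemPrefs(): SystemPrefs {
  if (typeof window !== "undefined" && typeof window.matchMedia === "function") {
    return {
      reducedMotion: window.matchMedia("(prefers-reduced-motion: reduce)").matches,
      forcedColors: window.matchMedia("(forced-colors: active)").matches,
    };
  }
  return { reducedMotion: false, forcedColors: false };
}

// Purely decorative animation is suppressed when the user asked for reduced
// motion; essential feedback (like a progress indicator) can still be shown
// through other means.
function shouldPlayDecorativeAnimation(prefs: SystemPrefs): boolean {
  return !prefs.reducedMotion;
}
```

The design choice is to read the setting rather than hard-coding behavior: the system preference, not the app, decides.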
>> Excellent point. And I think the last point we're going to make here is that some of these basic accessibility guidelines are either part of the commercial toolkits and tools that we're building with, or part of the usability standards for different platforms, whether that's the World Wide Web standards, or Android, or iOS, or all of the others. Being aware of when those take care of your usability and accessibility challenge, and when you need to do something else, is an important part of understanding this design. So, this has been our introduction to the module on universal design. This module has a series of lectures followed by a quiz at the end, and we'll be back to start talking about each of the individual special populations and special needs in sequence.
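As a concrete follow-up to the W3C standards mentioned above: WCAG's "Three Flashes or Below Threshold" criterion caps web content at three flashes in any one-second period. A simplified sketch of a rate check (rate only; a real checker also weighs the flashing area and the red-flash threshold, and the function name here is our own):

```typescript
// WCAG 2.3.1 (Three Flashes or Below Threshold): content must not flash
// more than three times in any one-second period.
const MAX_FLASHES_PER_SECOND = 3;

// Given the timestamps (in milliseconds) at which a screen region flashes,
// report whether any one-second window contains more than three flashes.
function exceedsFlashThreshold(flashTimesMs: number[]): boolean {
  const times = [...flashTimesMs].sort((a, b) => a - b);
  for (let i = 0; i < times.length; i++) {
    // Count flashes in the sliding window [times[i], times[i] + 1000).
    let count = 0;
    for (let j = i; j < times.length && times[j] < times[i] + 1000; j++) {
      count++;
    }
    if (count > MAX_FLASHES_PER_SECOND) return true;
  }
  return false;
}
```

For example, four flashes spaced 100 ms apart exceed the threshold, while flashes spaced 400 ms apart never put more than three in any one-second window.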