Our final set of illusions of knowing occurs with what we might call the illusions of the outsourced mind. These arise from a bias known as the individualism bias, which leads one to think that individual intellectual competence is much greater than it really is. Perhaps it is motivationally important for young children not to realize how incredibly dependent they are on other minds for getting around in the world. But even in adults, there is a tendency to assume that one knows a great deal more in one's own mind than one really does, and to overlook the facility and ease of getting information indirectly. In other words, one often relies on experts or external sources for understanding how the world works, but one may overlook that one is doing that and think it's all in one's head.

This occurs often in transactive memory, where you have couples, who might have been married for 40 years, who fill in each other's knowledge gaps all the time, and neither partner realizes that this is happening. Perhaps when they go shopping, one person gets the vegetables and the other gets the meats, and each just assumes they themselves have the competence to evaluate those items when, in fact, they're relying heavily on the other person. You also see it in the underestimation of external information sources, and we'll see that in a second.

So I want to give you two examples of how we get illusions of the outsourced mind. One has to do with understanding word meanings, and the other has to do with Internet search. The misplaced-meaning line of work relies heavily on an idea developed long ago by the philosopher Hilary Putnam, called the division of linguistic labor. This is the idea that we often know the meanings of words through hunches about who the relevant experts are. You think you know the meaning of gold, but in fact, you may not be able to tell the difference between fool's gold and real gold, except by deferring to an expert. So you know the meaning of gold, but only through a chain of deference to another mind. You don't really know it directly; you don't know the atomic properties that distinguish gold from impostors.

Now, in a study that we did to test this, we gave people large lists of pairs of words. There were three kinds of pairs: known similars, unknown similars, and pure synonyms. The known similars were things like wolf-dog, rowboat-canoe, and seal-walrus. The unknown similars were things like ferret-weasel, pine-fir, and blackbird-starling. And the pure synonyms were things like sofa-couch, car-automobile, and baby-infant. We'd give them a long list of these words, all mixed together, and have them quickly rate how many distinguishing features they thought they knew for each pair. The idea was to look at their gut impressions of how well they understood the differences in meaning between these pairs, and we did it quickly so they couldn't self-test and actually assess them.

And what you find is the following. The left-most panel shows the number of features they thought they could list for something like rowboat and canoe. The middle panel is how many they thought they could list for things like ferret and weasel. And the final panel is how many they thought they could list for true synonyms, like baby and infant. What you see is that they all thought they could list far fewer features for the true synonyms, which is, of course, accurate.
We then asked them to actually list the features that distinguish each pair. The mistake they made was in thinking that they could list just as many features for the unknown similars, like weasel and ferret, as for the known similars. In fact, when we asked them to list the features, they were no better for the unknown similars than they were for the true synonyms. In other words, they knew no features that distinguish weasel from ferret. But because they knew about the presence of experts out there in the world who did know the difference, and who probably knew several features, they displaced that knowledge into their own minds.

There's a much more modern case of this that we've recently uncovered, having to do with illusions of understanding that are induced by engaging in Internet search. One of the things about Internet search is that when one looks things up on a search engine, such as Google, one has instant access to unbounded information, and that instant, easy availability of comprehensive explanations and information may lead one to think that one knows things one can only get through search itself. In other words, the boundary between what's in one's mind and what's out there in the world can get blurred, and one starts to think that stuff one is searching for is stuff one actually knows.

You can intuitively get some sense that this is true when you think about what happens when you become unplugged from the Internet. Sometimes, for reasons beyond your control, perhaps you're on vacation and the hotel you went to doesn't have Internet service, or the service fails, and you don't have good cellphone service, so over several days you have no access to the Internet at all. And one might start to realize that one didn't understand things as well as one thought, and that some knowledge one thought one had, one didn't really have. Perhaps one thought one knew about the plots of certain movies one was going to see, or understood something about the weather systems where one was, or the history of the location where one was traveling. All that knowledge starts to shrink down, and one feels rather pathetic. In my own life, I've experienced this a few times when I was isolated by a hurricane or some other natural weather event, and I felt my knowledge shrinking almost on a daily basis as I realized how dependent, even parasitic, I was on the Internet for insight.

So we did a series of studies to illustrate this point. We would have one group of participants who had unlimited access to the Internet to answer questions about how things work, like how does a zipper work. Other groups would answer the question without access to the Internet: sometimes just off the top of their heads, sometimes by reading from a printout containing the same information they would have found on the Internet, but not one they had actually searched for. Then the groups were asked to rate their ability to answer questions on a seven-point scale. In a second phase, they all rated their own ability to answer questions in six domains unrelated to the questions posed in the first phase. So, I ask you how well you understand how a zipper works, you look it up on the Internet or you don't, you do this for a bunch of items, and now I ask you about items completely unrelated to those you've searched for, say how a flush toilet works, and how well you think you understand those.
What we find is that, regardless of how we construct the task, across many, many different conditions, the Internet group rates itself higher in many kinds of knowledge. So whether it's knowledge about weather, science, history, food, the human body, or health, people who've just engaged in Internet search for items unrelated to the ones now being asked about nonetheless inflate their understanding. It's as if they think, 'Oh my goodness, I really understand the world well,' and this inflates their sense of knowledge for things they haven't searched for; they still have this sense of access to the Internet that makes them think they know things better in their own heads.

One question you can ask is, do they really think it's in their own head? We've done a variety of follow-up studies to show that they really do. One particularly vivid example can be illustrated by physically instantiating what it means for knowledge to be in the head. To do this, we engaged in what we might call a little neuro-fiction. We told people that when you understand something better, your brain lights up more; there's more brain activity. So we showed them these fabrications. These are not real neuroimages; they are basically images colored in with red. And we said that when you don't understand something very well, the left-most panel, you have just a little bit of red glowing. When you understand it better, you have more activation of the brain, all the way up to the far right, which would be level seven of understanding. Again, I caution you that this is not real data; this is just neuro-fiction. But the vast majority of our participants believed it and thought this was a way of measuring how much you know.

We said, okay, now that they've just engaged in Internet search, or just answered the question without Internet search, we asked them: how well do you think you understand this, as measured by your brain activation? Slide this cursor along the bottom of the screen until it corresponds to the level at which your brain would be activated when understanding this. So how much knowledge you have in your own head is indicated by brain activation. And what you find is that they slide it further along, giving a higher rating of their own brain activation levels, after just having engaged in Internet search.

So regardless of how we measure it, there's a strong effect. People tend to see individual intellectual autonomy as far greater than it is. They often think they're lone wolves, investigating the world and figuring things out on their own, and they grossly overlook their reliance on external experts, deference, external sources, and the like. And this seems to be aggravated by having very easy access to instant information. Fifty years ago, it might have been having an encyclopedia in the basement; today, it's having the Internet at your fingertips.

All right, let me conclude. There are many sources of intellectual arrogance and intellectual humility, and many different cognitive biases and factors that can contribute to them. We focused mostly on heuristics and biases, as opposed to social and motivational factors. And we focused especially on those that consistently produce intellectual arrogance, as opposed to intellectual humility, because those seem to be much more common. In many cases, it is driven by knowledge structure and content.
Knowing something tends to make one draw these misleading inferences and attributions about what else one knows. And sometimes, these can interact with other biases as well. One of the challenges is that the plurality of causes of intellectual arrogance may make limiting it much more difficult. And so, one of the great research programs for the future is to figure out ways to reduce intellectual arrogance without simply making people feel terrible about what they know. Thank you.