So far we've talked about how to reduce reactance and ease endowment. Now we'll talk about the third key barrier we have to face, and that is distance: when something is too far from people's backyard, they tend to disregard it.

To say that the current political climate in the United States is divisive would be an understatement. More than half of Democrats and Republicans have very unfavorable feelings toward the opposing party, more than three times the number from the mid-1990s. Neighbors tear down yard signs, opposing perspectives are shunned, and many Thanksgiving dinners serve as reminders not to discuss politics.

So-called filter bubbles are a common explanation for the discord. Birds of a feather flock together, and people have always preferred media outlets that support their existing views, but technology has exacerbated these tendencies. Rather than talking to neighbors or flipping open the local paper, people get their news and information online, and the online ecosystem is increasingly tailored to one's existing views, steering people toward others who already agree with them. The web and social media have combined to create a state of intellectual isolationism, where people are rarely exposed to conflicting viewpoints. Combine this with people's penchant for clicking on information that supports their perspectives, and algorithms can lead people to become more and more isolated.

To solve this problem, pundits often suggest a basic solution: just reach across the aisle. Rather than holing up inside one's online bubble, talk to someone who sees things differently; create bridges toward the other side. Intuitively, this makes a lot of sense. By moving beyond caricatures and stereotypes and engaging with someone who disagrees, both sides will benefit. By understanding where the opposition is coming from, we'll all gain more nuanced views. But does reaching across the aisle actually work? Is it actually effective?
One sociologist was hopeful. Chris Bail from Duke University thought that if you could just get people to consider the other side, they'd come around. Exposure to opposing viewpoints would shift people toward the middle, maybe not a lot, but some. Liberals and conservatives wouldn't sing Kumbaya, but they'd at least move slightly toward the other party.

To test this possibility, Bail set up a very clever experiment. He recruited more than 1,000 Twitter users and had them follow accounts that exposed them to opposing viewpoints. For a month, they saw messages and information from elected officials, organizations, and opinion leaders from the other side. A liberal might see a tweet from Fox News or Donald Trump; a conservative might see posts from Hillary Clinton or Planned Parenthood. It was a digital version of reaching across the aisle, a simple intervention that could have big implications for social policy.

At the end of that month, Bail and his team measured users' attitudes: how they felt about different political and social issues, things like whether government regulation is beneficial, whether homosexuality should be accepted by society, or whether the best way to ensure peace is through military strength. It was a huge undertaking: years of preparation and thousands of hours of work. The hope was, as we discussed, that the thousands of pundits and columnists and other talking heads would be right. Connecting with the other side would bring people closer together.

But that's not what happened. Exposure to the other side didn't make people more moderate. In fact, just the opposite. Exposing people to opposing views did change minds, but in the opposite direction. Rather than becoming more liberal, Republicans exposed to liberal information became more conservative, developing more extreme attitudes. Liberals showed similar effects. It would be one thing if the tweets had tried to persuade. As we discussed previously in the section on reactance, persuasive attempts often induce pushback.
But in this instance, no one was trying to persuade. Rather than telling people to do one thing or another, most of the posts just contained information. So why didn't information by itself help?

Well, this comes back to us. When we try to change minds, we hope that evidence will work: that giving people facts, figures, and other information will encourage them to move in our direction. The intuition is simple. Data should lead people to update their thinking; they should consider the evidence and shift their opinions accordingly. Unfortunately, though, that doesn't always happen.

Take false information. Whether looking at medicine, politics, or various other domains, research finds that exposure to the truth doesn't always work. Sometimes it makes people more likely to believe the truth, but other times, and often, it just reaffirms falsehoods. Even though there's little intent to persuade, and so there should be little reactance, people still discount the information. Rather than changing false beliefs, exposure to the truth often increases misperceptions; it leads people to be more likely to believe the exact opposite. So one question, then, is: when does information work, and when does it backfire?