We want to open up our ethics discussion with some recent research from the psychology community that I think provides an important foundation for us. Psychologists have studied behavior across a wide range of domains for decades, but only recently have they started thinking about behavioral ethics. We've given you an interesting article by Banaji and colleagues from the Harvard Business Review. They title it provocatively, How Unethical Are You?, and they raise some interesting questions. I think their motivation is to raise awareness, and we want to have that level of awareness. So I want to start with this article.

They begin the article by asking you to answer true or false: I am an ethical decision maker. And I think most of us will be inclined, almost in a knee-jerk way, to answer true, I am an ethical decision maker. And they want to challenge us on that. They don't want to challenge us because they think we're bad people. They want to challenge us because they know, from a number of psychological studies now, that we have biases that will lead us to be less ethical than we think we are. So let's have a quick summary of what they find and argue. They basically say that neither strong convictions about being ethical nor good intentions to be ethical are enough to ensure being ethical. In particular, the problems arise for a few reasons.

First, we show implicit prejudices. We have these associations across social categories. So, for example, we might associate good or bad more with some demographics, with some social categories, than with others. This has now been studied with the Implicit Association Test by millions of people. There is some debate about the meaning of these scores, but where there seems to be some consensus is on the strength of the associations that many people have. These lead us to hold implicit attitudes, often beyond our awareness, good or bad, toward different social categories, attitudes that can affect decision making and beliefs. That is one of the reasons they're concerned that intentions and convictions are not enough.

A second is in-group favoritism. We're inclined to treat those who are in our in-group differently, to reason about them differently, and that leads us to be biased and, in some cases, less ethical. Some people have argued recently that some of the biggest challenges we have, the ones that get coded as stereotyping or racism, really come down to in-group favoritism, this in-group, out-group distinction. And the challenge is that we are so hardwired to favor our in-groups. Some people offer evolutionary explanations. Whatever the explanation is, it does seem to be one of the root causes: we tend to reason more favorably about those who are in our in-group, however that is defined. It might be a family, it might be an organization, it might be a function, like engineering. We tend to reason in a way that's favorable toward those people and less favorable toward those in the out-group, our out-group.

A third reason is that we overclaim credit. We are self-serving in what we believe we have done and contributed. This has been found in studies of all kinds. One of the most famous is on marriages. Researchers would go in and ask, okay, what percentage of these household tasks do you do? They ask that of both parties, and of course, this has to add to [LAUGH] 100%, because they're surveying everybody who's responsible for it. But what they find is that something like 85% of the time, the answers add to more than 100%. People are overclaiming credit for what they actually do.
They find this in teams at work. They find this among co-authors of academic papers. We seem to be hardwired or biased in this way. Part of it is just that we know more about what we do than about what others do, and that ends up leading us to overclaim credit. Again, it's not because of poor convictions or a desire to be unethical; it's just a consequence of our perspective and our self-serving biases.

Finally, we resolve conflicts of interest in ways that favor us. Many of us believe that we can manage whatever conflicts of interest we have in a neutral way. They give the example in the article of doctors. There have been big, big changes in the medical industry in recent years because regulators, and eventually even doctors themselves, realized that companies trying to persuade doctors to prescribe their drugs actually have an impact, even if doctors don't want them to have an impact, even if doctors believe they are neutral and uninfluenced by the tactics those companies use. And this has led them to basically proscribe any such influence in that industry whatsoever. That's almost what's necessary, because it's impossible for us to resolve the conflicts of interest we have in a perfectly objective way, again, despite intentions, despite convictions that we want to do so.

So their summary is that we are more biased than we think we are. And I am very sympathetic to that conclusion because it is so closely related to all the other research in psychology about our biases. It's a very important starting place: if we're honest with ourselves about our biased starting place, we are going to be more open to making changes, and more open to honestly making the trade-offs that are necessary when we're exerting influence. So why don't we start there?