Yeah, hi folks, this is Mat again. I'm going to tell you a little about the Shapley Value now, which is one of the most prominent ways of dividing up the productive value of a society, of some set of individuals, among its members. The basic idea in coalitional or cooperative games, when it comes to allocating value, is to have some notion of what the right way to do things is. We might even say, in quotes, what's the "fair" way for a coalition to divide up its payoff? That's obviously going to depend on how we define fairness, and the literature has basically taken axioms as the primary way of expressing the desired properties of rules for dividing things up. So what we're going to do is write down some set of axioms or properties that we want satisfied, and then see what that gives us. The Shapley Value is based on Lloyd Shapley's idea that members should receive something proportional to their marginal contributions. So basically we look at what a person adds when we add them to a group, and they should get something that reflects their added value to the society. Okay, so what's the tricky part about this? Let's take a quick example, and that will give us an idea of why we have to be careful. Suppose that everybody together in a society can generate 1, but that if we're missing any member we get 0. Think of a committee where everyone has to be present for it to do anything; if it's missing any of its members, it can't decide on anything. So in this situation we've got v(N) = 1, and v(S) = 0 for any S that is a strict subset of N. What's true here? The marginal contribution of any individual is 1: everybody is essential for generating the 1, so everybody's marginal contribution to the society without them is 1. And in this situation we can't pay everybody their full marginal contribution to the grand coalition, so we're going to have to think about some way of weighting contributions in order to come up with something reasonable. Obviously, for this particular game, it would be reasonable to split things equally, so everybody gets 1/n of the value. But in situations where there are asymmetries in who contributes which value, we're going to have to think very carefully about how the weighting should work. Okay. So Shapley's axioms are going to give us a handle on this, so let's take a look at those. The first idea is a very simple one, one which pretty much any rule you would think of in these settings is going to satisfy. Think of two different members of a society, say i and j. If they contribute the same thing to every possible coalition in which they could be a member, they're completely interchangeable. That is, if we're looking at some coalition that has neither i nor j in it, adding i to that coalition gives exactly the same value as adding j to that coalition. If they're interchangeable, then they should get the same allocation of value.
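To pin down the wording, here is one standard way of writing the interchangeability condition and the symmetry axiom just described, using ψ for the division rule; the notation is a common convention, not something fixed by the lecture:

```latex
% i and j are interchangeable in the game (N, v) if they add the same value
% to every coalition containing neither of them:
\[
  v\bigl(S \cup \{i\}\bigr) = v\bigl(S \cup \{j\}\bigr)
  \quad \text{for all } S \subseteq N \setminus \{i, j\}.
\]
% Symmetry axiom: interchangeable players receive the same allocation:
\[
  \psi_i(N, v) = \psi_j(N, v)
  \quad \text{whenever } i \text{ and } j \text{ are interchangeable in } (N, v).
\]
```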
So if ψ_i denotes what i gets when we divide up the value from some coalitional game, then we should be giving the same thing to i as to j when they're completely interchangeable, okay? This is a fairly uncontroversial axiom; it really captures a basic notion of fairness: if individuals are completely equivalent, they should get equivalent payments. Okay, next axiom: dummy players. I'm sure everyone has had some experience with people like this. What's the idea? It's a situation where you add a person i to a coalition and they add absolutely nothing, so no matter what S we look at, adding i to that S gives the same value as the situation without that individual. Basically, the person contributes nothing no matter what coalition we're looking at. The axiom is then: if an individual is a dummy player, we give them nothing. On the one hand, this is a fairly reasonable axiom: if someone is contributing absolutely nothing, there's no reason they should get anything. On the other hand, this depends very much on the perspective you're taking. If we're thinking about a society, it could be that i contributes nothing for reasons beyond i's control. Maybe something happened, they had an accident, or for some particular reason they aren't able to function, and society might still want to allocate something to those individuals. So it really depends on the time perspective, whether we're thinking about social insurance, and so forth. But nonetheless it's a fairly intuitive axiom, and it's going to be a fairly powerful one in what it delivers. The next one is additivity. This one you might think of as being more about the process of allocating value. Suppose we can look at a cooperative or coalitional game that separates very nicely into two different parts: we've got one cooperative game, we've got another one, and we think about what you get when you sum these two together. The idea is that if we're looking at two different cooperative games and we sum them up, allocating the sum should give the same thing as allocating according to the first game, allocating according to the second, and then adding those two allocations up. So if we're looking at a cooperative game where the value of any coalition is just what it gets under the first game plus what it gets under the second game, then the way we allocate values should be how we allocated things under the first game plus how we allocated things under the second game. This is fairly obvious in terms of what it means mathematically. In terms of how you interpret it, and what the story is for why you might want an axiom like this to be satisfied, that's a little harder. You could think of it as a story where society produces according to v1 one day and according to v2 the next day, and if what it produces the second day doesn't depend on what it produced the first day, then we should be able to allocate the fruits of production on the first day and then allocate again on the second day. Since those things don't interact at all, we should be able to do that separately, and what an individual gets is just the sum of those two things.
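In the same notation, here is a sketch of how the dummy-player and additivity axioms are usually written:

```latex
% Dummy player axiom: a player who adds nothing to any coalition gets nothing.
\[
  \text{If } v\bigl(S \cup \{i\}\bigr) = v(S) \ \text{ for all } S \subseteq N \setminus \{i\},
  \ \text{ then } \psi_i(N, v) = 0.
\]
% Additivity axiom: allocating the sum of two games equals summing the allocations.
\[
  \text{If } (v_1 + v_2)(S) = v_1(S) + v_2(S) \ \text{ for all } S \subseteq N,
  \ \text{ then } \psi_i(N, v_1 + v_2) = \psi_i(N, v_1) + \psi_i(N, v_2) \ \text{ for each } i.
\]
```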
So you can tell a fairly logical story for this kind of axiom. Okay, so what do we get from these three axioms? The Shapley Value. Let's have a look at exactly how you define it. The Shapley Value is built from marginal calculations: what does an individual i add to coalitions that don't have i in them when we add i? So we've got the coalition with i in it and the coalition without i, we take a peek at how much extra value i generates, and then we weight that by the different possible ways in which we could have come up with this marginal contribution, and divide through by all the possible ways we could have done this. So we average over all these things, and everything sums up to the full value. That's the Shapley Value, and we're going to dissect it in more detail in a moment. And what's the theorem? The theorem is that if we look at a coalitional or cooperative game, there is a unique way of dividing the full payoff of the grand coalition, so that everything gets divided up, that satisfies symmetry, dummy and additivity. If we put those three axioms together, there is only one way to do it, and that way is the Shapley Value. So that's a pretty powerful theorem. There's a fairly elegant and intuitive proof of this. We're not going to go through it in detail, but we'll go through some explanations. You can find the proof fairly easily in a number of different places; there's actually a nice book by Osborne and Rubinstein, which is free online, that has a proof of this. Okay, let's have a peek at the actual value, how this formula breaks down, and then we'll look at some examples. What an individual is given according to this formula looks a little daunting, but the intuitions are fairly simple. We're thinking of marginal contributions: how do they come about? What we're going to do is think of all the different possible ways we can build the society up. For instance, we could build it up by first adding person 1, then adding, say, person 3, then adding person 2; that would be one order in which we could build the society up. We could also have built it up by first adding person 2, then person 3, then person 1. So there's a whole series of different ways, if we had a three-person society, that we could go about building it up, and according to each of these orders we'll have different marginal contributions along the way. Here, first person 1 contributes something, then person 3 adds their production, then person 2 adds theirs, and so forth. We end up with these different contributions, and that's what the formula is going to capture. So we're looking at these different sequences. The first part is calculating, as we go along a sequence, what did i add when they were added? Next we weight this by the number of different ways we could have built up the coalition before i was added. Then we also weight it by the number of different orders in which we could add the individuals who haven't been added yet, after i has been added. The full formula is written out below.
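Here is the formula being described, in its standard form, with the notation matching the sketches above:

```latex
\[
  \phi_i(N, v) \;=\; \frac{1}{|N|!}
  \sum_{S \subseteq N \setminus \{i\}}
  |S|!\;\bigl(|N| - |S| - 1\bigr)!\;
  \Bigl[\, v\bigl(S \cup \{i\}\bigr) - v(S) \,\Bigr]
\]
% |S|! counts the orders in which the coalition S could have formed before
% i arrives; (|N| - |S| - 1)! counts the orders in which the players still
% left after i could be added; |N|! is the total number of orderings of N.
```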
So there's that factor: the number of individuals, minus the number who are already in the coalition before i, minus one. Taking the factorial of that gives us how many different orders we could still add the remaining people in, so we weight by that. Then we're summing over all possible coalitions that could be there before i, and dividing through by the total number of different orderings we could have over the people in this society. Okay? That's the Shapley Value. In terms of understanding this, think again of the ways in which we could build our society up: we could add person 1 first, then 2, then 3; or 1 first, then 3, then 2; or 2 first, then 1, then 3; or 2 first, then 3, then 1; or 3 first, then 1, then 2; or 3 first, then 2, then 1. So we could have done this in a whole series of different orders, and there are 6 of them, right? 6 different orders. And if we want to figure out what person 1 adds when we add them: in the first case it's v(1); in the second case it's again v(1); in the third case they're adding v(1,2) minus the v(2) that was already there; in the fourth case they're adding v(1,2,3) minus v(2,3); in the fifth case, v(1,3) minus v(3); and in the sixth case, v(1,2,3) minus v(2,3) again. Each one of these is weighted by a sixth. So the v(1) term ends up with a total weight of one third, the v(1,2,3) minus v(2,3) term also gets a total weight of one third, and the other two terms each get a weight of one sixth. That gives us the total, the Shapley Value, and it tells us what person 1 should be getting in this setting. Now let's take a look at our simpler example with just 2 individuals and figure out exactly what the Shapley Value gives. These are two people who form a partnership. Person 1 alone was generating a production of 1, person 2 was generating a production of 2, and they say, wow, let's get together and form a partnership, we can do better than we can separately. Together they generate a total value of 4, so this is nicely superadditive: we're getting a higher value when we put the two together. And now, at the end, they try to figure out how to divide the 4 between them. Well, in this case we could have added person 1 first and then person 2, or person 2 first and then person 1, so there are only 2 different ways we could have built the society up. If we're trying to figure out what to give person 1 out of this: in the first ordering they get v(1), which is 1, their marginal contribution if they were added first; in the second ordering they get v(1,2) minus v(2), their marginal contribution if they were added second, which is 4 minus 2, or 2. Each of these gets a weight of one half, because there are two orderings. So we're adding a half of 1 and a half of 2, and we get 1.5, which is phi of 1. You can go through and check that phi of 2 is then going to be 2.5. So what do we end up with? The Shapley Value tells us that, given these contributions, 1.5 is the right amount to give to person 1 and 2.5 the right amount to give to person 2. Okay.
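Here is a small sketch in Python of the ordering-based calculation just described. It enumerates every ordering directly, so it is only practical for tiny examples like these; the function name and the dictionary representation of v are illustrative choices, not anything from the lecture.

```python
from itertools import permutations

def shapley_values(players, v):
    """Average each player's marginal contribution over all orderings.

    `players` is a list of player labels; `v` maps a frozenset coalition
    to its value (coalitions missing from the dict are treated as worth 0).
    """
    def value(coalition):
        return v.get(frozenset(coalition), 0)

    totals = {i: 0.0 for i in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = []
        for i in order:
            before = value(coalition)
            coalition.append(i)
            totals[i] += value(coalition) - before  # i's marginal contribution
    return {i: totals[i] / len(orders) for i in players}

# Two-person partnership from the lecture: v(1)=1, v(2)=2, v(1,2)=4.
partnership = {frozenset({1}): 1, frozenset({2}): 2, frozenset({1, 2}): 4}
print(shapley_values([1, 2], partnership))    # {1: 1.5, 2: 2.5}

# Three-person "committee" game: only the grand coalition is worth anything.
committee = {frozenset({1, 2, 3}): 1}
print(shapley_values([1, 2, 3], committee))   # each player gets 1/3
```

For larger games you would sum over coalitions with the factorial weights from the formula above rather than over all n! orderings, but for two or three players the two computations are easy to check against each other.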
So in this case they are each getting a value that takes into account what their stand-alone values were; the rule is trying to divide the 4 accordingly, rather than just saying, okay, let's split the 4 fifty-fifty. It's doing a different calculation than that, and it comes out at 1.5 and 2.5 in this case. Okay. So what about the Shapley Value? It allocates the value of a group according to marginal calculations, and it's captured by some very simple logic and axioms. You could think of other axioms, other fairness ideas or other kinds of properties you'd like your rule to satisfy, and those are going to lead to different kinds of predictions. We'll take a look at the core next, which uses a different kind of logic than the Shapley Value for making predictions about how a society should divide up its value.