0:05

Alright, to just work through another example here, let's take a particular case of people being audited by the IRS. In this example, suppose we've got an accountant with three clients who each filed estate returns of more than $5 million. There's a 50% chance, for each of these clients, that they're going to be audited.

So what are the chances that all three of them are going to be audited? That is, client one and client two and client three are all audited. Well, there's a 50% chance for client one and a 50% chance for client two, so we're going to multiply those together, and then a 50% chance for client three. It turns out that the joint probability of all three being audited is going to be 12.5%.

All right, what about none of them being audited? That's client one not audited, client two not audited, and client three not audited. It's the probability that client one is not audited, multiplied by the probability that client two is not audited, multiplied by the probability that client three is not audited. It actually turns out to be the same 12.5%, because we're dealing with a 50% probability here.

And then how likely is it that at least one of these clients is audited? We're not saying that exactly one is audited, we're saying at least one of them: it could be one, it could be two, it could be all three of them. Well, we can actually use the complement rule here: what's the complement of at least one of them being audited? It's that none of them are audited. So 1 minus 12.5% gives us 87.5%.

The big assumption that we're making here is that these observations are independent of each other. If we have reason to believe that the likelihood that client one is audited is linked to the likelihood that client two is audited, this is going to break down, and we're going to have to use the more general multiplication rule.
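The audit calculations above can be sketched in a few lines of Python (not part of the course materials, just a check of the arithmetic):

```python
# Three clients, each with an independent 50% chance of being audited.
p = 0.5

# All three audited: multiply the independent probabilities.
p_all = p * p * p            # 0.125, i.e. 12.5%

# None audited: multiply the complements (1 - p) for each client.
p_none = (1 - p) ** 3        # also 0.125, because p happens to be 0.5

# At least one audited: complement rule applied to "none audited".
p_at_least_one = 1 - p_none  # 0.875, i.e. 87.5%

print(p_all, p_none, p_at_least_one)
```

Note that the multiplication step is only valid because the audits are assumed independent.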

Â 2:01

Again, another example of where these addition and multiplication rules come into play: let's look at service plans. Retailers like to offer them: buy a TV, and you should buy a service plan to make sure it's covered in case anything were to go wrong. So let's suppose that 20% of LCD TVs need service once, 10% need service twice, and 5% have to be serviced three times or more.

All right, what's the probability that your TV never needs to be serviced? It means it's not serviced once, it's not serviced twice, and it's not serviced three times or more. Since those outcomes are mutually exclusive, it's 1 minus the sum of those probabilities, using the complement rule.

What's the probability of your television requiring at least two service calls? That means it could need two service calls, or it could need three or more service calls, with "or" being the tip-off for the addition rule. So the 10% chance of needing two service calls plus the 5% chance of needing three or more gives us a 15% chance of needing at least two service calls.
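A quick sketch of both service-plan calculations, using the figures from the lecture:

```python
# P(serviced once) = 0.20, P(twice) = 0.10, P(three or more) = 0.05.
# These outcomes are mutually exclusive categories.
p_once, p_twice, p_three_plus = 0.20, 0.10, 0.05

# Never serviced: complement of being serviced at all,
# where "serviced at all" is the sum of the mutually exclusive outcomes.
p_never = 1 - (p_once + p_twice + p_three_plus)  # 0.65

# At least two service calls: addition rule over "twice" and "three or more".
p_at_least_two = p_twice + p_three_plus          # 0.15

print(p_never, p_at_least_two)
```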

Now, in terms of pricing out this warranty plan, and we'll look at this as an example a little bit later on in the course, what else do you need to know? This tells you how frequently televisions are going to need to be repaired. You're probably also going to need some information about what it costs to do the repairs. And from the customer's standpoint, to make the decision of whether to pay for the warranty plan, I probably need some price information: how much is it going to cost me to pay for this plan versus how much would it cost me just to go out and get a new TV?

Â 3:40

All right, so let's go over to Vegas for a little bit. If we think about having a pair of standard dice, what I've put in this table are the possible ways of rolling different combinations: everywhere from rolling snake eyes, a two, all the way up to rolling a 12, a six and a six. In the middle of this table are the possible pairs I've enumerated, all the different ways you could roll a particular sum. With a pair of dice there are 36 possible rolls you can get, so our probabilities are all based out of 36: how many different pairs add up to a particular number, divided by 36, gives me the probabilities.
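The table described here can be reproduced by enumerating all 36 equally likely rolls; a short sketch:

```python
from collections import Counter

# Tally how many of the 36 equally likely (die1, die2) rolls produce each sum.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

# Probability of each sum is its count out of 36.
probs = {total: count / 36 for total, count in counts.items()}

print(counts[2], counts[7])  # snake eyes: 1 way; seven: 6 ways
```

Seven is the most likely sum at 6/36, while two (snake eyes) and twelve each occur only 1/36 of the time.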

Well, if you're playing games like craps, this is how we'd go about calculating the likelihood that you'd win on a particular roll. Now, if we look at this table, what I'm going to look at for a second is the lower corner: the one-roll bets. Rolling snake eyes pays out 30 to 1 for every dollar bet, and rolling two sixes also pays out 30 to 1. Is that a good bet for you? Or, to look at it a different way, where does the casino, where does the house, make its money?

Â 5:03

All right, well, that's where we've got to look at the idea of the expected value, or the expected payout. Mathematically, your expected payout is the probability of an event occurring multiplied by the payout associated with that event, summed over all of the particular combinations.

Â 5:28

Take rolling snake eyes. The chance of that happening is one out of 36, but they're only paying you $30 for every dollar that you bet. So there's a difference between the payout and the true odds, and that difference is where the casinos are making their money. Because once you take your bet into account, your expected payout is negative: you're actually going to be losing money on that one. Yes, it's paying out 30 to 1, but the chances of getting it are only one out of 36.
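The expected-payout calculation for the snake-eyes bet works out like this (profit of +$30 on a win, loss of the $1 bet otherwise):

```python
# One-roll bet on snake eyes: 1 chance in 36, pays 30-to-1.
p_win = 1 / 36
payout = 30  # dollars of profit per dollar bet on a win

# Expected profit per $1 bet: sum of probability * payoff over the outcomes.
expected_profit = p_win * payout + (1 - p_win) * (-1)
print(round(expected_profit, 4))  # -0.1389: you expect to lose about 14 cents per dollar
```

If the bet paid the true odds of 35 to 1, the expected profit would be exactly zero; the gap between 30 and 35 is the house's edge.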

As an exercise, you may also want to go through the field bet area and take a look: what is the probability of rolling these different numbers, what's the payout associated with rolling them, and am I making any money off of that bet?

Â 6:15

Another game where we can calculate expected payouts relatively easily is roulette. In roulette we've got red and black, numbers 1 through 36, plus zero and double zero.

6:31

If we take a look at the roulette table and you bet on any one of these columns, or any one of what look like rows here, there are 12 numbers in there. The payout is 2 to 1 for every dollar you bet. On the surface that seems not bad: there are 12 numbers, but your chances are not 12 out of 36. They're actually 12 out of 38, because of the 0 and 00.

If you were to bet red or black, or bet 1 through 18 or 19 through 36, it's close to a 50-50 coin flip in terms of whether you're going to win. But the odds are actually in the house's favor, because the probabilities aren't based on the numbers 1 through 36; they're based on 38 numbers once you factor in the zero and double zero. That's what's going to tilt things in the casino's favor.
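Both roulette bets described here have the same expected payout once the 0 and 00 are counted; a quick sketch:

```python
# American roulette: numbers 1-36 plus 0 and 00, so 38 equally likely slots.
n_slots = 38

# Column (or "row") bet: 12 winning numbers, pays 2-to-1.
p_column = 12 / n_slots
ev_column = p_column * 2 + (1 - p_column) * (-1)

# Even-money bet (red/black, 1-18, 19-36): 18 winning numbers, pays 1-to-1.
p_even = 18 / n_slots
ev_even = p_even * 1 + (1 - p_even) * (-1)

print(round(ev_column, 4), round(ev_even, 4))  # both about -0.0526
```

The house edge is 2/38, roughly 5.26%, on either bet.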

Â 7:30

How do we make this decision of how much we should expect a new customer to be worth, and how much we should be willing to invest in that customer? Well, if we go through our expected payout calculation, that's actually the average customer lifetime value: our expectation of what a customer is going to be worth to us. In this case that average is around $1,200, which gives us an upper bound for how much we should be willing to spend in terms of managing the relationship. Now obviously, we're going to want to leave room so that we're profiting off of these relationships, but it does establish that upper bound for us.
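The lecture doesn't give the underlying numbers behind the $1,200 figure, but the calculation has the same shape as the casino examples. A sketch with hypothetical customer segments (the probabilities and values below are invented for illustration):

```python
# Hypothetical segments: (probability a new customer is in the segment,
# lifetime value of a customer in that segment).
segments = [
    (0.5, 500),    # assumed: 50% of customers worth $500
    (0.3, 1500),   # assumed: 30% worth $1,500
    (0.2, 2500),   # assumed: 20% worth $2,500
]

# Expected lifetime value: sum of probability * value, as with any expected payout.
expected_clv = sum(p * value for p, value in segments)
print(expected_clv)  # about $1,200, the upper bound on spend per customer
```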

So let me go through an example where, again, we're going to apply everything we've talked about so far in terms of the probability rules.

Â 8:15

This is referred to as the Monty Hall problem, popularized by game shows. If you've seen the movie 21, about the MIT card counters going to Vegas, there's a scene with the actor Kevin Spacey popularizing it a little bit more, but let me walk through the basics of the game. The Monty Hall problem is a simple game: there are three doors; behind two of those doors are goats, and behind one of those doors is a new car. You as a contestant don't know what's behind any one of these doors, so you're going to have to make your best guess as to which door you want to select: door 1, door 2, or door 3. Once you've selected a door, the game show host is going to reveal one of the goats. Let's say you select door number two, and the game show host tells you there's a goat behind door number one.

Â 9:10

You now have the option of changing the door that you've selected: okay, I've selected door number two, and now I know there was a goat behind door number one. Should I change over to door number three? Does it help me? It's tempting to say, well, no, it doesn't make any difference to me; I've still got two doors, it's a coin flip. So that's one argument. But let's work through what happens based on the information that we've been provided. The way we're going to formulate this problem is: how likely is it that I win by changing the door that I initially chose?

Well, there are two ways that you could potentially win by changing. One possibility is you guessed right initially and you win on the change. The other possibility is you guessed wrong initially and you win on the change. So we've decoupled this based on whether I guessed right or wrong initially, and let's break that down a little bit further using the multiplication rule.

What's the probability that I guessed right initially and I win on a change? Well, it's the probability that I win on a change, given that I guessed right initially, multiplied by the probability that I guessed right initially. And if we start filling in some numbers: if I guessed right initially and I changed, well, now I lose. So that first probability is going to be zero.

Â 10:32

What's the probability that I guessed right initially? Three doors, only one chance for me to guess right, so that's 1/3. All right, let's look at it the other way. What's the probability that I win on a change, given that I guessed wrong initially? If I change, the other goat has been revealed and only the car remains, so there's a 100% chance I win. And the probability of selecting a goat initially is 2/3. So if we work through the math on this, the probability of me winning when I change my initial selection is going to be two-thirds.

Â 11:18

Alright, what about the probability that I win with no change? Well, how do I do that? The only way is to guess right initially and win on not changing, and that only happens the one-third of the time that I guess right initially. Because if I guessed wrong initially and I don't change, then I lose, and I guess wrong initially two-thirds of the time. So, the probability of winning on a change is two-thirds, and the probability of winning without making a change is one-third. You're actually better off taking advantage of the information that's provided to you and making that change. You're actually twice as likely to win.
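If the two-thirds result still feels counterintuitive, a Monte Carlo simulation is an easy way to check it; a sketch:

```python
import random

def play(switch, rng):
    """Play one round of the Monty Hall game; return True on a win."""
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a door that holds a goat and isn't the contestant's pick.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened, unpicked door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(0)
trials = 100_000
wins_switch = sum(play(True, rng) for _ in range(trials)) / trials
wins_stay = sum(play(False, rng) for _ in range(trials)) / trials
print(wins_switch, wins_stay)  # roughly 0.667 and 0.333
```

Over many trials the switching strategy wins about two-thirds of the time, matching the probability argument above.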

All right, so why does this matter? Well, when we're evaluating the quality of information, whether we're looking at analyst predictions or making an investment in acquiring additional information, how valuable is that information going to be to me? It's going to depend on how reliable the information is, and it's also going to depend on how much that information costs me to acquire.
