0:13

In this lecture we will learn about Bayes Rule, a rather complicated-looking formula that has many applications. But don't worry, you don't actually have to use or memorize a complicated formula. I will show you how we can easily do the relevant calculations in probability tables. Before we look at an ugly formula, let's look at a small, simple example to introduce some ideas.

1:04

Good parts have a probability of 90%; the opposite, bad parts, have a probability of 10%. The company looked at some recent data to see whether there is any difference between its suppliers, S, T, and U. And here is the data: among the good parts, 60% came from S, 25% from T, and the last 15% from supplier U. Among the bad parts, 40% came from supplier S, and 30% each from T and U.

1:41

And now our manufacturer is asking the following question. Which of my suppliers delivers the best parts, that is, the smallest proportion of bad parts? And which is my worst supplier? So in the language of probability, it is asking the following questions. What's the probability of a good part given that it came from supplier S? What's the probability of a good part given that it came from T? Given that it came from U? We can't answer those questions yet, because our data set tells us something different.

2:28

We learned the probability of good, 0.9, and the probability of bad, 0.1. The other probabilities are conditional probabilities, but they go the wrong way. We have the probability of S given good, because we were told that among the good parts, 60% came from S. And so on; at the bottom of the slide you see the other numbers. But what we are interested in is the probability of good given S. So what should we do? Let's create our probability table with the data that's given. Notice we have good and bad parts on the one hand, and the three suppliers S, T, and U on the other. For good and bad we were given the numbers 0.9 and 0.1, so we can fill in the right margin of our little table, and the sum should definitely be one.

3:25

In the interior, we can now use the general multiplication rule for dependent events to create the intersection probabilities. For example, the probability of a part being good and from S is 0.6 x 0.9, and so on, all the way to the probability of bad and U, 0.3 x 0.1. So we can fill in the probabilities in the interior, add up every column, and what do we get? Voila, there's our complete probability table. We see that 58% of all parts are from supplier S, 25.5% of all parts are from supplier T, and the remaining 16.5% are from supplier U.
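The table construction just described can be sketched in a few lines. The numbers are taken from the example; the variable names are my own, not from the lecture:

```python
# Given marginals P(good) = 0.9, P(bad) = 0.1, and the supplier
# shares among good and bad parts from the recent data.
p_quality = {"good": 0.9, "bad": 0.1}
p_supplier_given_quality = {
    "good": {"S": 0.60, "T": 0.25, "U": 0.15},
    "bad":  {"S": 0.40, "T": 0.30, "U": 0.30},
}

# Interior of the table: joint probabilities via the general
# multiplication rule, P(quality and supplier) = P(supplier | quality) * P(quality).
joint = {
    (q, s): p_supplier_given_quality[q][s] * p_quality[q]
    for q in p_quality
    for s in ("S", "T", "U")
}

# Adding up each column gives the supplier marginals.
p_supplier = {s: joint[("good", s)] + joint[("bad", s)] for s in ("S", "T", "U")}
print({s: round(p, 3) for s, p in p_supplier.items()})
# → {'S': 0.58, 'T': 0.255, 'U': 0.165}
```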

Now we can calculate the probabilities we really care about; here we go. Probability of good given S: remember how we do this. We take the joint probability from the interior, in our case 0.54, and divide by the marginal probability of supplier S, 0.58. And what do we learn? 93.1% of the parts that come from supplier S are good, or, more formally, in the language of probability, the probability of good given S is 93.1%. You see all the calculations, and what do we learn? We see that the proportion of good parts is the largest for supplier S. Put differently, the proportion of bad parts is the smallest for supplier S, and supplier U is actually the worst in this little case.
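A minimal sketch of this flipping step, using the joint and marginal probabilities from the completed table:

```python
# P(good | supplier) = P(good and supplier) / P(supplier),
# with the joint probabilities from the table's interior and
# the marginals from the column sums.
joint_good = {"S": 0.54, "T": 0.225, "U": 0.135}   # P(good and supplier)
p_supplier = {"S": 0.58, "T": 0.255, "U": 0.165}   # column sums

p_good_given = {s: joint_good[s] / p_supplier[s] for s in joint_good}
for s, p in sorted(p_good_given.items(), key=lambda kv: -kv[1]):
    print(f"P(good | {s}) = {p:.1%}")
# → P(good | S) = 93.1%
#   P(good | T) = 88.2%
#   P(good | U) = 81.8%
```

Supplier S comes out best and supplier U worst, exactly as stated above.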

5:27

We flipped the conditional probabilities. This casual language of flipping probabilities around is actually very popular among people in probability theory, and so that's why I use this everyday language. We were given the probabilities of S given good, U given bad, and so on. And in the end, we calculated the reverse conditional probabilities of good given S, good given T, all the way to bad given U.

6:01

And what we did in this little toy example is actually a very general approach. There's a general rule which does exactly what we just did, and that is the famous Bayes Rule. Let me derive it for you. Recall the general multiplication rule that we saw before for conditional probabilities: the intersection probability equals the conditional probability times the probability of the condition.

If we use this once with event B as the condition, and once with event A as the condition, and then set the two right-hand sides equal,

6:44

we get the formula at the bottom of the slide. So the probability of A given B equals the probability of B given A, times the probability of A, divided by the probability of B. Notice on the left we have the probability of A given B, and on the right we have the probability of B given A. So if you give me B given A and the marginal probabilities, I can flip around the condition using this formula.
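The derivation just described can be written out in formulas, using the same events A and B:

```latex
% General multiplication rule, applied twice:
P(A \cap B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)
% Setting the two right-hand sides equal and dividing by P(B)
% gives Bayes Rule for flipping the condition:
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
```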

7:19

Now, we can go a step further. Sometimes you may not have the probability of B; this also happened in our little calculation. In that case we can calculate it by adding up elements of the probability table; for completeness, I gave you the formula at the top. If you now take this formula and put it into the flipping formula, we get the formula at the bottom with the yellow background. And that's the famous Bayes Rule for two events. In our case we just had good and bad; in general language, A and A complement. And what this Bayes Rule says is that, knowing the conditional probabilities of B given A and B given the complement of A, we can flip around the conditionals and calculate the probability of A given B, and we can also calculate the probability of the complement of A given B.
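The two-event version can be checked numerically with the numbers from our example, taking A to be "the part is good" and B to be "the part came from supplier S":

```python
p_a = 0.9                 # P(A)   = P(good)
p_b_given_a = 0.60        # P(B|A) = P(S | good)
p_b_given_not_a = 0.40    # P(B|A complement) = P(S | bad)

# Total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c),
# i.e. the column sum from the probability table.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes Rule: P(A|B) = P(B|A)P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_b, 3), round(p_a_given_b, 3))  # → 0.58 0.931
```

The result matches the 93.1% we computed directly from the table.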

8:21

There's nothing special about our example of good and bad, or A and A complement; this scales up. Remember, our probability tables can be as large as we want, and the same is true for Bayes Rule. So if you have not just two but a larger number m of mutually exclusive and collectively exhaustive events, then here you see how the formula scales up, and as you see, it looks rather ugly. This formula usually scares my students, and so I learned over the years to de-emphasize the actual rule and its nasty-looking formula, and to teach it via simple examples. As in the example I showed you at the beginning, the best way is to fill in the probability table and then ask yourself: which conditional probability do I really need? Calculate it using the definition of conditional probability, and essentially you're applying this complicated-looking formula.
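The m-event rule can be sketched the probability-table way as a small helper (the function name `bayes` is my own, not from the lecture). As a consistency check, flipping the probabilities of bad given each supplier back again recovers the supplier shares among bad parts that we started with:

```python
def bayes(priors, likelihoods):
    """priors[i] = P(A_i) for m mutually exclusive, collectively
    exhaustive events; likelihoods[i] = P(B | A_i).
    Returns the flipped conditionals P(A_i | B)."""
    joints = [p * l for p, l in zip(priors, likelihoods)]  # interior cells
    p_b = sum(joints)                                      # column sum = P(B)
    return [j / p_b for j in joints]

# Priors: supplier shares P(S), P(T), P(U) from the table margins.
# Likelihoods: P(bad | supplier) from the table interior.
posteriors = bayes([0.58, 0.255, 0.165],
                   [0.04 / 0.58, 0.03 / 0.255, 0.03 / 0.165])
print([round(p, 2) for p in posteriors])  # → [0.4, 0.3, 0.3]
```

We get back 40% from S and 30% each from T and U, the shares among bad parts given in the data.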

9:47

This concludes our module on conditional probabilities. As I said at the beginning of the module, this is not an easy concept, and I showed you a few examples. In the next module I will focus very much on the concepts of conditional probability, dependence, and independence in the context of some real-world problems, I could say real-world disasters that happened. And you will see that these concepts, maybe you still think they're very abstract, and you wonder, do I really need this in everyday life? The answer is yes. Yes, and I will show you some cool applications, so please come back for our next module. Thank you very much, and enjoy the session with the TA on calculating some probabilities.
