0:30

So, the conditional probability definition that we saw in the previous lecture.

Probability of A given B is equal to the intersection probability

divided by the probability of B.

If you want to think of the probability of B given A,

it's the intersection probability divided by P of A.

Now if we do a little algebra and

multiply both sides of this equation by the denominator,

so in the first equation, for example, by P of B,

we get this multiplication rule: the intersection probability of A and

B equals the conditional probability of A given B times the probability of B.

You look at this math and you say, this looks kind of abstract.

Actually, I am sure you have used it, perhaps without knowing it.

We use this rule all the time.

So, let me try to convince you that this is actually quite intuitive.
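The rearrangement is easy to check numerically. A minimal sketch, using made-up numbers for P(B) and the intersection probability (they are not from the lecture, just for illustration):

```python
# Definition: P(A|B) = P(A and B) / P(B); hypothetical numbers for illustration
p_B = 0.4
p_A_and_B = 0.1

p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 0.25

# Multiplying both sides by P(B) recovers the multiplication rule:
print(p_A_given_B * p_B)  # 0.1, i.e. P(A and B) again
```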

2:12

So, look at the numbers.

If I tell you 17.6% live in Zurich and, of those people,

19.6% are young, what would you do?

You would say, yeah, 0.176 times 0.196.

That's now the fraction of people that have both:

they're both young and live in the Canton of Zurich.

You do this and you get 3.45%.

We have all done this in our lives a lot.

If I give you half a cake, and of that half I cut it into five pieces and

give you one piece, how much of the total cake did you get?

We started out with half.

I gave you one-fifth, 20% of that, so you would say, 0.2 times 0.5 is 0.1.

I just gave you 10% of the cake.

So we multiply these probabilities all the time,

just as in this example or in the cake example.
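Both calculations are the same two-line arithmetic; a quick sketch using the percentages quoted in the lecture:

```python
# Fraction of people who are both young and living in the Canton of Zurich
p_zurich = 0.176              # P(lives in Zurich)
p_young_given_zurich = 0.196  # P(young | lives in Zurich)
print(round(p_zurich * p_young_given_zurich, 4))  # 0.0345, i.e. about 3.45%

# Same arithmetic with the cake: half the cake, then one fifth of that half
print(0.5 * 0.2)  # 0.1, i.e. 10% of the whole cake
```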

But now, let's look at this everyday

calculation in the context of conditional probabilities.

3:18

If we take these proportions as our definition of probability, the empirical

probability concept, concept number two from a previous module,

then we would say: the probability that someone is from Zurich is 0.176.

The probability of someone being young in the Canton of Zurich,

the probability of 0 to 19 years given the person is from Zurich, is 0.196.

And now if we use the multiplication rule, the probability of Zurich and

0 to 19 years old is then the probability of Zurich times the conditional

probability of 0 to 19 years given Zurich, blah, blah, blah, you do some math,

and you get 3.45%, exactly what you did by gut feeling before.

So, this abstract-looking multiplication rule for conditional

probabilities is actually an everyday concept:

a proportion of a proportion, and then we multiply.

4:19

Let me remind you once more of the concept of independence.

Independence meant that the occurrence of one event does not affect

the chances of another event occurring.

That was: the probability of A is equal to the probability of A given B.

If that's not the case, if they're unequal, we say there is dependence.

Now what happens if we take our multiplication rule and

assume that A and B are independent?

In that case,

the conditional probability of A given B is just the original probability of A.

Now replace the conditional probability in the general multiplication rule and

you get a specialized multiplication rule, and that's the multiplication rule I

showed you in the previous module when we talked about independent events.

So the probability of A intersection B is equal to P(A) times P(B).

Look, that looks much easier than the general multiplication rule.

That's why people like the independence assumption and this rule.

Let's look at an example where we can easily use it.

Let's say you play a dice game with three dice.

What's the probability of three ones?

A one on the first roll, on the second roll, and on the last die.

So you want the probability of a one, and a one, and a one.

Rolling three dice, they are independent, so I'm allowed to multiply:

one-sixth times one-sixth times one-sixth is one in two hundred sixteen, and

there is nothing special about one, one, one.

I can ask you what's the probability of first a one, then a three, then a five.

Same math.

So you see, rather large complex events, like 3 numbers in a row,

and you can do this for 20 numbers in a row, for 200 numbers in a row,

suddenly get very, very easy under the assumption of independence.

We can just multiply their probabilities.
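Under independence, chaining any number of rolls is just repeated multiplication. A small sketch using exact fractions:

```python
from fractions import Fraction

def prob_specific_sequence(n_rolls, sides=6):
    """Probability of one specific outcome on each of n independent fair rolls."""
    return Fraction(1, sides) ** n_rolls

print(prob_specific_sequence(3))   # 1/216, e.g. (1, 1, 1) or (1, 3, 5)
print(prob_specific_sequence(20))  # 20 in a row is just as easy to compute
```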

As simple as this is, we have to be careful in real-world applications.

There, we often have to ask ourselves: is that assumption reasonable?

Can we really assume independence?

If yes, great for you.

We can use the independence multiplication rule.

If, however, the answer is no, you are not allowed to use this rule.

You may get into real trouble.

And later on in this course, I will show you some devastating applications

where people assumed independence and terrible real-world things happened.

Here, now, I want to give you a very simple example.

Let's say you have a machine in an assembly line and

that machine carries a heavy load.

And as a result, it breaks down on average on one out of ten days.

On nine out of ten days, it can handle the workload and it works fine.

So if we use historical data now and

the empirical definition, concept number two,

we can say: the probability of a good working day is 0.9,

of a breakdown 0.1, and here is now the question:

what is the probability that this machine works two days in a row?

So if I don't know anything more,

I would have to use the general multiplication rule.

The probability of working well on the first day and

working well on the second day.

That's the probability of working well on the first day times the conditional

probability of working well on the second day,

given it worked well on the first day.

One probability I know: P of working well on the first day is 0.9.

That's from my data, but what's the conditional probability?

Now, I'm in trouble.

So now, I would love to assume independence.

If I have independence, then I can use the simpler rule for

independent events at the bottom of the slide:

0.9 times 0.9, 0.9 squared, is 0.81.
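Under the independence assumption, the two-day calculation, and any longer run, is a one-liner; a sketch with the 0.9 figure from the historical data:

```python
# P(machine works n days in a row), assuming the days are independent
p_work = 0.9  # empirical probability of a good day

def prob_works_n_days(n):
    return p_work ** n

print(round(prob_works_n_days(2), 2))   # 0.81, as on the slide
print(round(prob_works_n_days(10), 3))  # 0.349 -- long runs become unlikely fast
```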

But if I cannot assume independence, then I need more data.

Now, which is it?

8:53

If you ask engineers, engineers like to talk about the so-called bathtub curve.

A new machine often has breakdowns because it isn't perfectly calibrated, so

the probability is maybe a little elevated, and the machine working well for a day or

two or three indicates that the machine is calibrated and the probability changes.

It reaches a low and remains constant for a while.

And eventually, due to wear and tear, it goes up.

In the middle range, independence is an okay assumption.

But at the front end and at the back end, it isn't.

So here, we already see how a very trivial application

of multiplying probabilities can give rise to rather tricky issues.

Do we have independence or do we not have independence?

And that's crucial for our calculations.

And as I said, we will see more cool examples in lectures to come.

Let me wrap up this lecture.

We have seen the general multiplication rule for

conditional probabilities, and the special case of independence.

The multiplication rule simplifies greatly there, but

be careful when assuming independence.

Thanks for your attention.

Please come back for more fun with probabilities.

Thank you.
