Hi, and welcome to my session on Naive Bayes. Naive Bayes is a very popular technique; it's used everywhere in machine learning. It's very common in classification, but it really has a wide range of applications. Before I go into the technical details of Naive Bayes, I would like to do a quick refresher on probabilities. So, let's start with an example. Here I have two boxes, and these two boxes contain two types of fruits, apples and oranges. I have a red box and a blue box. And let's say that I randomly pick one of the two boxes: I pick the red box with a probability of 40 percent, and I pick the blue box with a probability of 60 percent. Right. So before I reach into a box to grab a fruit, I select whether I'm reaching into the red or the blue box, and these are the probabilities that I will go red or blue. I use the following notation: I would say that the probability of B, which is my random variable that stands for box, being red is, as I said, 40 percent. So that's four-tenths, or 0.4. And the probability of selecting the blue box, so the box B being blue, is 60 percent, or six over ten. Okay. By definition, probabilities lie in the interval between zero and one. And if the events include all possible outcomes and they are mutually exclusive, like in this case, where if I have selected the red box then I'm not reaching into the blue box and vice versa, then their probabilities will sum up to one. In our case that's absolutely true, because four-tenths plus six-tenths equals ten-tenths, which is one. Right. Now I would like to establish some definitions, because we'll be using them when we talk about Naive Bayes. The first one is what we call marginal probability. So, I will write it here: marginal probability. Right.
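The box probabilities from the example can be checked with a tiny sketch. This is just an illustration of the "mutually exclusive and exhaustive events sum to one" rule; the dictionary name and structure are my own choices, not anything from the lecture.

```python
from fractions import Fraction

# Marginal probabilities of the box choice, as in the example:
# P(B = red) = 4/10, P(B = blue) = 6/10.
p_box = {"red": Fraction(4, 10), "blue": Fraction(6, 10)}

# The two events are mutually exclusive and cover all outcomes,
# so their probabilities must sum to exactly one.
total = sum(p_box.values())
print(total)  # 1
```

Using `Fraction` keeps the arithmetic exact, so four-tenths plus six-tenths comes out as exactly one rather than a floating-point approximation.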
And this marginal probability, that's my first definition, is the probability of an event occurring when the event is not conditioned on any other event. So in my case, the probability of selecting the red box, say, doesn't depend on anything; it's always 40 percent. So the probability of B equals red, which is four-tenths, is a marginal probability. My second definition will be joint probability. And this is the probability of two events occurring together. So for example, what is the probability of selecting a specific fruit from a specific box? Maybe, what's the probability of selecting the red box and the fruit, I will use F as the random variable for my fruit, being an apple. A will be apple, O will be orange. Right. So, that's a question we could ask: what's the probability of getting an apple from the red box? And this is what we call joint probability. And the third definition that I would like to give you is conditional probability. Okay, and this is the probability of an event occurring given that another event has occurred. Right. So, I can give you an example here. Say, what is the probability of the fruit being an apple given that I have selected the red box? By the way, this one is fairly easy to compute, because the probability of getting an apple if I have already selected the red box is just the fraction of apples in the red box. The red box has two apples out of eight fruits, so this will be two over eight, which is one-fourth. Right. So we'll be using these definitions, marginal probability, joint probability, and conditional probability, when we talk about Naive Bayes. There is one other thing that I would like to define now, and these are the rules of probability. Right.
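The conditional and joint probabilities from this example can be sketched the same way. The red box contents (two apples out of eight fruits) come from the lecture; the blue box counts below are made up purely for illustration, and the function names are my own.

```python
from fractions import Fraction

# P(B): marginal probability of choosing each box, from the lecture.
p_box = {"red": Fraction(4, 10), "blue": Fraction(6, 10)}

# Fruit counts per box. The red box (2 apples, 6 oranges) matches the
# lecture; the blue box counts are hypothetical, for illustration only.
counts = {
    "red": {"apple": 2, "orange": 6},
    "blue": {"apple": 3, "orange": 1},
}

def p_fruit_given_box(fruit, box):
    """Conditional probability P(F = fruit | B = box):
    the fraction of that fruit among all fruits in the chosen box."""
    return Fraction(counts[box][fruit], sum(counts[box].values()))

def p_joint(fruit, box):
    """Joint probability P(F = fruit, B = box) = P(F | B) * P(B)."""
    return p_fruit_given_box(fruit, box) * p_box[box]

print(p_fruit_given_box("apple", "red"))  # 1/4, as computed in the lecture
print(p_joint("apple", "red"))            # 1/10
```

Note how the joint probability is built from a conditional times a marginal; that product rule is exactly one of the rules of probability introduced next.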