Hello. Now we get into the heart of statistical thermodynamics, the "why statistics" part. Statistics is the mathematics of variability, and nothing is more variable than dynamic properties at the microscopic level. What we see at the macroscopic level are averages of that variability. If we knew the kinetic energy of every molecule in a given system, then the average kinetic energy would just be given by a sum over the individual energies. However, we cannot know all the energies; there are just too many particles. What we will be able to determine are probability distributions.

Probability distributions are functions that describe the likelihood, or probability, of finding a particle in a given state. Distributions can be discrete or continuous. In the two examples, x is the variable of interest, and the probability of finding a value of x between the limits a and b is given by summing or integrating the probability distribution function from a to b. As we shall see, it is convenient to represent rotational, vibrational, and electronic motion of molecules with discrete functions, and translation with a continuous function.

Distributions have some important integral properties. If a distribution function is normalized, then its integral is unity. For example, for the distribution of spatial location, the particle must be somewhere, and thus the integral over all space should be one. The next two integrals are the mean and the variance. These are called moments of the distribution function. In principle, there are an infinite number of moments, and if all the moments were known, the distribution function could be recovered.

We have mentioned Gibbs's name several times. Gibbs was a professor of physics at Yale University, and he played an important role in thermodynamics. Although he died before the development of quantum mechanics, he was able to lay out the framework for modern statistical thermodynamics.
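The distribution properties discussed a moment ago, normalization, the mean, and the variance as moments, can be sketched numerically. This is just an illustration with made-up numbers (the discrete probabilities and the choice of a Gaussian are mine, not from the lecture slides):

```python
import numpy as np

# Discrete case: a made-up distribution over four states of a variable x.
p = np.array([0.4, 0.3, 0.2, 0.1])   # P(x_i); normalized, sums to 1
x = np.array([0.0, 1.0, 2.0, 3.0])   # values the variable can take

norm_d = p.sum()                          # normalization: should be 1
mean = np.sum(p * x)                      # first moment, the mean <x>
variance = np.sum(p * (x - mean) ** 2)    # second central moment

# Continuous case: a standard Gaussian density, integrated on a fine grid.
xs = np.linspace(-10.0, 10.0, 20001)
dx = xs[1] - xs[0]
f = np.exp(-xs ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

norm = np.sum(f) * dx             # integral of f(x) dx over the grid, ~ 1
mean_c = np.sum(xs * f) * dx      # mean, ~ 0 by symmetry
var_c = np.sum(xs ** 2 * f) * dx  # variance, ~ 1 for a standard Gaussian

print(mean, variance)             # both come out to 1 for these numbers
print(round(norm, 4), round(var_c, 4))
```

For the discrete case the sums are exact; for the continuous case the grid integral only approximates the true moments, which is why the tails are carried out to plus or minus ten standard deviations.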
Gibbs made the following observations. The first is that matter is made up of a large number of particles; he just didn't know their details. The second is that there is a difference between the time-average behavior of a single system and the average behavior of a large number of systems at an instant in time.

There are two comments from the preface of his book that I really like. The first is, "Moreover, we avoid the gravest difficulties when, giving up the attempt to frame hypotheses concerning the constitution of material bodies, we pursue statistical inquiries as a branch of rational mechanics." This was because, at the time, there were raging arguments about the nature of atomic and molecular structure and behavior; it was not settled science. The second comment is, "The only error into which one can fall is the want of agreement between the premises and the conclusions, and this with care one may hope in the main to avoid." I call this the old college try approach: let's try it and see if it works. Great, it worked.

The physics of the situation are as follows. One, there are large numbers of particles; the particles have kinetic and potential energy and are constantly in motion. Two, although we have not yet discussed quantum mechanics, the particles, and thus systems, are constantly changing quantum state. We will see that, left to their own devices, atoms and molecules exist in discrete dynamic states called quantum states. We will make a distinction between particle and system quantum states; the importance of this will become apparent as we move forward. The system quantum state is that state for which all the particle quantum states are specified. Next, either particle or system quantum states are those allowed by the macroscopic constraints, i.e., conservation of energy.
For example, neither a single particle nor a collection of particles can have a total energy greater than that of the system. Secondly, dynamics preclude most allowed states from appearing within a reasonable observation time. We could do a simple calculation to prove this: it takes time for quantum states to change. In a gas, for example, collisions must occur, and they occur at a certain average rate; at that rate, it would take a time longer than the life of the universe for every allowed state to occur. Three, since equilibrium is observed, most observed states must look similar, in the sense that they correspond to the same macroscopic state. This is a key point: equilibrium is in fact observed. Finally, it follows from the previous point that equilibrium system quantum states must be much more likely to occur than non-equilibrium ones.

So how do we find the equilibrium quantum state? One way is to look at the relative probability of each allowed state and characterize equilibrium with the most probable allowed quantum state. Thus, we must find the distribution of quantum states that is most probable. Traditionally, there have been two methods for finding the most probable allowed quantum state. One is to watch a single system and look at all the allowed states. This requires us to assume that all allowed states are equally likely to appear, which, as we just discussed, is not the case. Or we could watch a large number of systems, each subject to the same conditions; if the number of systems is large enough, all allowed quantum states can appear. Since the first method is physically unrealistic, as we pointed out, Gibbs chose the second. He introduced the concept of the ensemble. An ensemble is a collection of identical systems called members. The members may be connected to a reservoir or reservoirs, but the reservoirs are so large that the associated intensive properties are held constant in all members.
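The "longer than the life of the universe" claim can be checked with a back-of-envelope calculation. The numbers below are my own illustrative choices (a typical gas-phase collision rate, one mole of molecules, and a modest ten states per particle), not values from the lecture; the counts are so large that we work in base-10 logarithms to avoid overflow:

```python
import math

# Assumed, illustrative inputs (not from the lecture):
collisions_per_molecule_per_s = 1e9   # typical gas-phase collision rate
n_molecules = 6.022e23                # one mole of particles
age_of_universe_s = 4.4e17            # roughly 14 billion years
states_per_particle = 10              # deliberately modest estimate

# Each collision can change the system quantum state, so an upper bound
# on the number of distinct states the system could visit is
# (rate per molecule) * (number of molecules) * (available time).
log10_states_visited = (math.log10(collisions_per_molecule_per_s)
                        + math.log10(n_molecules)
                        + math.log10(age_of_universe_s))

# The number of allowed system quantum states grows roughly like
# (states per particle) ** (number of particles).
log10_states_allowed = n_molecules * math.log10(states_per_particle)

print(f"states visited ~ 10^{log10_states_visited:.0f}")    # about 10^50
print(f"states allowed ~ 10^{log10_states_allowed:.2e}")    # about 10^(6e23)
```

Even with generous assumptions, the states a system could visit in the entire age of the universe (around 10^50) are a vanishing fraction of the allowed states (around 10 raised to the power 6 x 10^23), which is the point of the argument.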
All members are subject to the same conditions, and all constraints and allowed interactions are the same. The overall ensemble is a closed system.

Finally, in this video, we discuss how to find the properties once we know the quantum state distribution functions. Given that information, we can write the averages of U, V, and N_i using the notation <A>. In statistics this is called the expectation value, but we can just think of it as the average value:

<A> = (1/N) * sum_j n_j A_j

where N is the total number of ensemble members, n_j is the number of ensemble members in quantum state j, and A_j is the value of the property of interest in ensemble member quantum state j; the indices are given on the slide. In the next video, we will present two postulates that allow us to calculate the equilibrium properties and find the most probable distribution. Have a great day.
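As a small coda, the ensemble-average formula above can be sketched directly in code. The occupation numbers and property values here are made up purely for illustration:

```python
import numpy as np

# Ensemble average <A> = (1/N) * sum_j n_j * A_j, where n_j is the number
# of ensemble members found in system quantum state j and A_j is the value
# of the property A in that state. (Hypothetical numbers for illustration.)
n_j = np.array([5, 3, 2])        # members occupying each quantum state j
A_j = np.array([1.0, 2.0, 4.0])  # value of A in state j (e.g., an energy)

N = n_j.sum()                    # total number of ensemble members
A_avg = np.sum(n_j * A_j) / N    # the expectation value <A>

print(A_avg)   # (5*1 + 3*2 + 2*4) / 10 = 1.9
```

Note that n_j / N is just the probability of finding a member in state j, so this is the same "weighted average over a distribution" idea introduced at the start of the video.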