[MUSIC] Having introduced a basic understanding of statistics, let's proceed to statistical thermodynamics. Our goal here is to develop the statistical thermodynamic description of two example ensembles: a closed isolated system and a closed isothermal system.

Let's look at the first example, a closed isolated system, which is thermodynamically characterized by its entropy, S(U, V, N). When considered in statistical thermodynamics, this thermodynamic system is called the microcanonical ensemble. The aim here is to find a statistical distribution for the molecular states of the system that obeys the first and second laws of thermodynamics. The internal energy U identifies the total energy, both kinetic and potential, of the molecules in the system, and only states with total energy E = U are permitted in this ensemble. In addition, the probability distribution must maximize the system entropy, as required by the second law of thermodynamics.

Let's say that the probability that the system exists in a state mu at any given time is P_mu. That is, if T_mu is the period of time the system exists in state mu and T is the total time, then the ratio T_mu / T tends to P_mu as T tends to infinity.

Now, in order to satisfy the second law of thermodynamics, we need to introduce a definition for the system entropy. This is a key concept, and Ludwig Boltzmann proposed an idea that was radical for its time: the system entropy is simply proportional to the information entropy, S = -k_B sum_mu P_mu ln P_mu, and the proportionality constant k_B came to be known as the Boltzmann constant. Remember that at the time Boltzmann proposed this idea, Shannon's ideas were still 70 years in the future.

Our task now is to determine the value of P_mu that maximizes the entropy while maintaining the normalization condition. This means that we need to maximize the entropy S subject to the constraint that the probabilities sum to one, sum_mu P_mu = 1. This forms a constrained optimization problem, and the technique of Lagrange multipliers is an extremely useful one here. The idea is to maximize the system entropy subject to the constraint by adding a Lagrange multiplier, lambda_0, to the equation. Maximization occurs when a perturbation delta P_mu does not alter the function, that is, when the first variation is zero. Since delta P_mu is arbitrary, the function is maximized when P_mu itself satisfies the resulting equation, and this relation says that all of the probabilities are equal.

Now let's say the total number of degenerate states is Omega. The condition that the probabilities sum to one then gives P_mu = 1/Omega. This leads to equal a priori probability: a closed isolated system in thermal equilibrium at a given U, V, and N spends an equal amount of time in each state over a sufficiently long period of time. The system entropy is then simply S = k_B ln Omega. This gives complete knowledge of all the thermodynamic behavior of the system, provided Omega can be evaluated.
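To make the equal a priori result concrete, here is a minimal numerical sketch, assuming a hypothetical system with Omega = 100 degenerate states and reduced units where k_B = 1. It checks that the uniform distribution maximizes the information entropy over the allowed states, and that the maximum equals Boltzmann's S = k_B ln Omega.

```python
import numpy as np

k_B = 1.0  # Boltzmann constant in reduced units (illustrative assumption)

def entropy(p):
    """Information entropy S = -k_B * sum(p * ln p), skipping zero entries."""
    p = p[p > 0]
    return -k_B * np.sum(p * np.log(p))

Omega = 100  # hypothetical number of degenerate states with energy E = U

# Equal a priori probability: P_mu = 1/Omega for every allowed state.
p_uniform = np.full(Omega, 1.0 / Omega)

# Any other normalized distribution over the same states has lower entropy.
rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.random(Omega)
    p /= p.sum()  # enforce the normalization constraint sum(P_mu) = 1
    assert entropy(p) <= entropy(p_uniform)

# The maximum entropy is exactly Boltzmann's S = k_B * ln(Omega).
print(np.isclose(entropy(p_uniform), k_B * np.log(Omega)))  # True
```

Any non-uniform distribution over the same states gives a lower entropy, which is the numerical counterpart of the Lagrange multiplier argument above.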
Now let's switch gears and consider a second example, a closed isothermal system, which is thermodynamically characterized by the Helmholtz free energy, F(T, V, N). When considered in statistical thermodynamics, this thermodynamic system is called the canonical ensemble.

Again we need to find a statistical distribution for the molecular states of the system that obeys the first and second laws of thermodynamics. In this case, the internal energy U identifies the statistical average of the kinetic and potential energy, that is, the average energy <E> of the molecules in the system. The probability distribution must maximize the system entropy, as given by the second law of thermodynamics. In this ensemble the total energy is not uniquely fixed; as a result, the summation over the states of the system includes all the quantum states of the system, each weighted by the appropriate probability. In a fixed-temperature ensemble the total energy fluctuates about an average value, such that the average energy <E> is identified with the internal energy U.

This ensemble needs to enforce two constraints. The first constraint is that the probabilities sum to one, sum_mu P_mu = 1, and the second is that the internal energy equals the average energy, U = sum_mu P_mu E_mu. The probability distribution that maximizes the entropy while satisfying these two constraints corresponds to the maximization of a Lagrange multiplier equation, and note that there are now two Lagrange multipliers, lambda_0 and lambda_1.

Setting the first variation of this function to zero yields the relation for P_mu. Note that in this case, P_mu is the microcanonical probability multiplied by a factor that depends on the energy E_mu. The normalization constraint then gives P_mu = exp(-lambda_1 E_mu / k_B) / Q, where Q = sum_mu exp(-lambda_1 E_mu / k_B). From this equation emerges the quantity Q, which is called the partition function.

There is still the unknown lambda_1. How do we find this? Through thermodynamic relations. Consider the differential of the entropy S: the second and third terms in that expression vanish because sum_mu dP_mu = 0, since the probabilities remain normalized. The differential of the average energy is simply dU = sum_mu E_mu dP_mu. From our definition of temperature, we know that the change in internal energy as we change the entropy is simply the temperature, (dU/dS) = T. This allows us to identify the Lagrange multiplier lambda_1 as simply one over the temperature, lambda_1 = 1/T, so that P_mu = exp(-E_mu / (k_B T)) / Q with Q = sum_mu exp(-E_mu / (k_B T)).

The probability and the partition function are now completely specified for the canonical ensemble. The partition function contains complete information about the system: once the partition function is determined, we can determine every thermodynamic quantity for the system, as the sketch at the end of this section illustrates.

Now to summarize, in this module we made our first attempt at developing a molecular-level understanding of thermodynamics. This involves understanding the statistics of large numbers of molecules. We began by learning the basics of statistics, showed the equivalence between coin tossing and a random walker, and ended the discussion of statistics with the central limit theorem. Then we derived the probability distributions that govern two ensembles: a closed isolated system and a closed isothermal system. In the closed isolated case, we found equal a priori probability, with the entropy given by the logarithm of the number of possible states. In the closed isothermal case, we found the emergence of temperature as a Lagrange multiplier. We also identified a new quantity called the partition function, which contains complete information about the system.
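As a closing illustration, here is a short sketch of the canonical-ensemble machinery derived above, assuming a hypothetical four-level system with evenly spaced energies and reduced units where k_B = T = 1. It builds the partition function Q, the probabilities P_mu, the internal energy U, and the entropy S, and verifies the standard identity F = U - T S = -k_B T ln Q.

```python
import numpy as np

k_B = 1.0  # Boltzmann constant in reduced units (illustrative assumption)
T = 1.0    # temperature in reduced units (illustrative assumption)

# Hypothetical, evenly spaced energy levels E_mu of a four-state system.
E = np.array([0.0, 1.0, 2.0, 3.0])

beta = 1.0 / (k_B * T)

# Partition function: Q = sum_mu exp(-E_mu / (k_B * T)).
Q = np.sum(np.exp(-beta * E))

# Canonical probabilities: P_mu = exp(-E_mu / (k_B * T)) / Q.
P = np.exp(-beta * E) / Q

# Internal energy as the ensemble average: U = <E> = sum_mu P_mu * E_mu.
U = np.sum(P * E)

# Entropy from the information-entropy form: S = -k_B * sum_mu P_mu * ln(P_mu).
S = -k_B * np.sum(P * np.log(P))

# Consistency check: the Helmholtz free energy F = U - T*S equals -k_B*T*ln(Q).
print(np.isclose(U - T * S, -k_B * T * np.log(Q)))  # True
```

Changing T shifts the weight among the levels: as T decreases, the probability concentrates in the ground state, and as T increases, the levels approach the equal occupation of the microcanonical case.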