0:19

Indeed, the Entropy Maximization Principle is fundamentally a convex optimization problem.

Convexity is guaranteed by the way in which we have defined the entropy: a global maximum exists, and it is actually found by the method of Lagrange multipliers.
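As a sketch of that construction (discrete case, with an assumed energy constraint): maximize the entropy subject to normalization and a fixed average energy.

```latex
\max_{p}\; S[p] = -\sum_i p_i \ln p_i
\quad\text{s.t.}\quad \sum_i p_i = 1,\qquad \sum_i p_i E_i = \langle E\rangle .
% Stationarity of  L = S - \alpha\bigl(\textstyle\sum_i p_i - 1\bigr) - \beta\bigl(\textstyle\sum_i p_i E_i - \langle E\rangle\bigr):
%   -\ln p_i - 1 - \alpha - \beta E_i = 0
\;\Rightarrow\quad p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}.
```

Since the entropy is strictly concave and the constraints are linear, this stationary point is the unique global maximum.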

0:43

Now, the Principle of Maximum Entropy states that, subject to precisely stated prior data, such as a proposition that expresses testable information, the probability distribution which best represents the current state of knowledge is the one with maximum entropy.

1:05

Now, another way of stating this: take precisely stated prior data, or testable information, about a probability distribution function.

Then consider the set of trial probability distributions that would encode the prior data you have collected. Of those, the one with maximum information entropy is the proper distribution according to this principle.
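As a small numerical sketch of this selection procedure (the setup here is an assumption for illustration, echoing Jaynes's die example: faces 1 through 6 with a prescribed mean of 4.5): among all trial distributions satisfying the mean constraint, the maximum-entropy one takes the exponential-family form p_i ∝ exp(λ·x_i), with the single multiplier λ fixed by the constraint.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_mean(xs, target_mean):
    """Maximum-entropy distribution on points xs with E[x] = target_mean.

    The solution has the form p_i ∝ exp(lam * x_i); we find the
    multiplier lam with a 1-D root find (the problem is convex,
    so the root is unique).
    """
    xs = np.asarray(xs, dtype=float)

    def mean_gap(lam):
        s = lam * xs
        w = np.exp(s - s.max())          # stabilized weights
        return (w @ xs) / w.sum() - target_mean

    lam = brentq(mean_gap, -50.0, 50.0)
    s = lam * xs
    w = np.exp(s - s.max())
    return w / w.sum()

# Jaynes-style die example: faces 1..6, constrained average 4.5
p = maxent_mean(np.arange(1, 7), 4.5)
```

Any other distribution with the same mean, say all probability split between faces 4 and 5, has strictly lower entropy than the distribution this returns.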

1:32

The principle was first expounded by Edwin Jaynes in two papers in 1957, where he emphasized a natural correspondence between Statistical Thermodynamics and Information Theory.

In particular, Jaynes offered a new and very general rationale for why the Gibbsian approach to Statistical Thermodynamics works.

He argued that the entropy of Statistical Thermodynamics is essentially one and the same with the information entropy concept of Information Theory.

Consequently, Statistical Thermodynamics should be seen as just a particular application of a general tool of logical inference and information theory.

2:30

Now, this is a wonderful question. Let's discuss it with an example.

Take all the air in a room and start it out in a special configuration: one in which the gas occupies only a small corner of the room.

In a short amount of time, the molecules will spread out and occupy the full volume of the room.

Now, can the opposite happen? No, because it would violate the Second Law of Thermodynamics, right? The entropy of the confined gas is less than the entropy of the gas that occupies the entire room.

But, if we time-reverse every molecule of the final state, the air will rush back to the corner of the room.

The problem is that if we make a tiny error in the motion of just a single molecule, the error grows exponentially, at a rate known as the Lyapunov exponent. And instead of going back into the corner of the room, the molecules actually go and fill the entire room.
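A toy illustration of this sensitivity (the one-dimensional logistic map is an assumed stand-in for the actual molecular dynamics, chosen only because it is simple and chaotic): the map x → 4x(1−x) has Lyapunov exponent ln 2, so a perturbation of 10⁻¹² roughly doubles on each step until it saturates at order one.

```python
def separation_history(steps=60, eps=1e-12):
    """Track |x - y| for two trajectories of the chaotic logistic map
    x -> 4 x (1 - x), started a distance eps apart."""
    x, y = 0.3, 0.3 + eps
    history = []
    for _ in range(steps):
        x, y = 4 * x * (1 - x), 4 * y * (1 - y)
        history.append(abs(x - y))
    return history

seps = separation_history()
```

Within a few dozen steps the two trajectories, initially indistinguishable, are completely decorrelated.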

Now, this is not to say that freak accidents don't happen. If we wait long enough, the air in the room will accidentally congregate in the corner.

The correct statement is not that unlikely things never happen, but only that they happen very rarely. The time you would have to wait for the unusual air event to take place is exponential in the number of molecules.
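The exponential scaling is easy to see in a back-of-the-envelope model (the numbers are assumptions for illustration: independent, uniformly distributed molecules and a corner taking up one tenth of the room). The probability that all N molecules sit in the corner at once is (1/10)^N, so the expected waiting time between such events grows like 10^N.

```python
def all_in_corner_prob(n_molecules, corner_fraction=0.1):
    """Probability that all molecules are simultaneously in the corner,
    treating them as independent and uniformly distributed over the
    room (an idealization)."""
    return corner_fraction ** n_molecules

for n in (1, 10, 100):
    print(n, all_in_corner_prob(n))
```

Already for a hundred molecules the probability is 10^-100; for a realistic ~10^23 molecules, the waiting time dwarfs the age of the universe.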

4:08

Now, that's a great question. There are many physical phenomena of interest that involve quasi-thermodynamic processes that are slightly out of equilibrium.

Let's take some examples: heat transport by the internal motions in a material, driven by a temperature imbalance; electric currents, carried by the motion of charges in a conductor, driven by a voltage imbalance; spontaneous chemical reactions, driven by a decrease in free energy; friction, dissipation, quantum decoherence, and so on.

All of these processes occur over time with characteristic rates, and these rates are of crucial importance in engineering.

Now, the field of what is called Non-equilibrium Statistical Thermodynamics concerns itself with understanding these non-equilibrium processes at the microscopic level.

Statistical Thermodynamics, the way we have learned it, can only be used to calculate the final result after all these external imbalances have been removed and the ensemble settles back into equilibrium.

In principle, Non-equilibrium Statistical Thermodynamics could be exact: ensembles, for instance, for an isolated system could be evolved over time according to deterministic equations such as Liouville's theorem, or its quantum mechanical version, the von Neumann equation.

However, in order to make headway in modeling these irreversible processes, it is necessary to add additional ingredients besides probability and reversible mechanics.

Non-equilibrium Statistical Thermodynamics is therefore still an active area of theoretical research, as the range of validity of these additional assumptions continues to be explored.

6:17

Now, this is a tricky question.

Â As we have learned,

Â there are Thermodynamic Potentials that govern the behavior of specific ensembles.

Â And these Thermodynamic Potentials are state variables.

Â Now an important theorem holds for

Â the state variables that the second order partial derivatives with respect to these

Â potentials, do not depend on the order in which you perform the derivative.

Let's take an example: the internal energy U. We can take a second-order derivative of the internal energy U with respect to the entropy and the volume.

We can take this first with respect to volume and then with respect to entropy, or first with respect to entropy and then with respect to volume.

Why is this important? It turns out that with this, you can connect the partial derivative of temperature with respect to volume to the derivative of the pressure with respect to entropy.

Now, these relations are what are known as Maxwell relations.

The Maxwell relations are very useful for relating difficult-to-measure Thermodynamic Quantities to ones that are more easily determined. In particular, changes in entropy, as you pointed out, are difficult to measure directly. So it is useful to relate them to changes in pressure, volume, or temperature.
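A quick symbolic check of the example above. The closed form chosen for U(S, V) is a hypothetical ideal-gas-shaped expression with all constants lumped into A, assumed here purely for illustration; the Maxwell relation itself follows from the symmetry of mixed second derivatives and holds for any smooth U.

```python
import sympy as sp

S, V, A = sp.symbols('S V A', positive=True)

# Hypothetical internal energy with the monatomic-ideal-gas shape;
# A lumps together particle number, Boltzmann constant, mass, etc.
U = A * V**sp.Rational(-2, 3) * sp.exp(sp.Rational(2, 3) * S)

T = sp.diff(U, S)     # temperature:  T = (dU/dS)_V
p = -sp.diff(U, V)    # pressure:     p = -(dU/dV)_S

# Maxwell relation from the order-independence of d²U/dS dV:
#   (dT/dV)_S = -(dp/dS)_V
lhs = sp.diff(T, V)
rhs = -sp.diff(p, S)
print(sp.simplify(lhs - rhs))  # -> 0
```

The two sides agree identically, which is exactly the statement that the mixed second derivative of U does not depend on the order of differentiation.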

8:40

Now, the Chemical Potential also provides a characteristic energy: that is, the change in energy when one particle is added to the system, holding, of course, the entropy and the volume constant.
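In symbols, that definition reads:

```latex
\mu \;=\; \left(\frac{\partial U}{\partial N}\right)_{S,\,V}
```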

8:53

Now, I have to add that these three assertions need to be qualified

Â by the contextual conditions under which they have been framed.

Â Now, the first statement captures an essence especially when

Â the temperature is uniform.

Â Now, if this is not the case and the temperature varies spatially,

Â diffusion is somewhat more complex.

Â Now, the second statement that we described

Â is valid if the temperature is uniform and fixed.

9:24

If instead the total energy is fixed, and the temperature may vary from place to place, then it turns out that the Chemical Potential divided by the temperature measures this contribution.

Now, when one looks for conditions that describe chemical equilibrium, one may focus on each locality separately, and then the division by temperature is inconsequential.
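As a sketch of why the ratio appears: for two subsystems A and B of an isolated system exchanging particles at fixed total energy, maximizing the total entropy gives

```latex
\frac{\mu_A}{T_A} \;=\; \frac{\mu_B}{T_B},
```

which reduces to the familiar equilibrium condition μ_A = μ_B when the temperature is uniform.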

9:48

Now, the system's external parameters are the macroscopic environmental parameters, such as the external magnetic field or the container volume, that appear in the energy operator, or the energy eigenvalues.

All external parameters are to be held constant when the derivative in statement three that we described is formed. The subscript V, that is, volume, illustrates merely the most common situation.

Note that the pressure does not appear in the energy eigenvalues. So, in the present usage, pressure is not an external parameter.
Â