This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.

From the course by University of Minnesota

Statistical Molecular Thermodynamics

From the lesson

Module 7

This module is relatively light, so if you've fallen a bit behind, you may have the opportunity to catch up. We examine the concept of the standard entropy made possible by the Third Law of Thermodynamics. The measurement of Third-Law entropies from constant-pressure heat capacities is explained, and for gases the results are compared to values computed directly from molecular partition functions. The additivity of standard entropies is exploited to compute entropy changes for general chemical changes. Homework problems will provide you with the opportunity to demonstrate mastery in applying these concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

All right. This week of the course had a little less material in it than some of the others, but maybe people were ready for a little bit of a break as we come into the home stretch. Nevertheless, let's take a look at what the most important concepts were.

So, first, the entropy variation associated with a change in temperature can be expressed as the heat capacity, either at constant volume or at constant pressure, depending on what kind of a system you're looking at, divided by T. And just as integrating the heat capacity can be used to determine the enthalpy, the integration of the heat capacity divided by temperature is used to determine the entropy.

The Third Law of Thermodynamics says that at non-zero temperatures all substances have positive entropies; at 0 kelvin the entropy of a perfect crystal is equal to zero. An entropy of exactly zero depends on there being a single, non-degenerate ground state. So if the ground state has some degeneracy, the entropy might still be very, very small, but it won't be exactly zero. Exactly zero requires a non-degenerate ground state, in which case W, the measure of disorder, is equal to 1.
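In symbols, this is just the Boltzmann entropy relation (a standard result used throughout the course):

```latex
S = k_\mathrm{B} \ln W, \qquad W = 1 \;\Longrightarrow\; S = k_\mathrm{B} \ln 1 = 0
```

Any ground-state degeneracy W > 1 leaves a small but non-zero residual S = k_B ln W at 0 K.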

And also, the probability of being in a given state j is the Kronecker delta δ_ij, where i indexes the ground state. (I actually used zero as that index earlier this week.) In any case, there is one state with probability 1, and all other states with probability 0.

I told you about the research of Professor William Giauque, who generated temperatures very near to absolute zero using adiabatic demagnetization. That process allows you to go beyond the lowest temperature that's achievable from adiabatic gas expansion and really come arbitrarily close to absolute zero.

Calculating the entropy through this integration process that I just mentioned on the last slide can be written mathematically as: the entropy at a given temperature T is the integral from zero to the melting point of the heat capacity of the solid over temperature, plus the entropy change associated with melting, plus another integral from the melting point to the boiling point of the heat capacity of the liquid over temperature, plus another phase-change entropy change, now vaporization, and then finally, integrating from the boiling point up to an arbitrary temperature T, the heat capacity of the gas over temperature.
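Written out (the standard Third-Law expression; the fusion and vaporization terms use common textbook notation):

```latex
S(T) = \int_0^{T_\mathrm{fus}} \frac{C_p^{\mathrm{solid}}(T')}{T'}\,dT'
     + \frac{\Delta_\mathrm{fus}H}{T_\mathrm{fus}}
     + \int_{T_\mathrm{fus}}^{T_\mathrm{vap}} \frac{C_p^{\mathrm{liquid}}(T')}{T'}\,dT'
     + \frac{\Delta_\mathrm{vap}H}{T_\mathrm{vap}}
     + \int_{T_\mathrm{vap}}^{T} \frac{C_p^{\mathrm{gas}}(T')}{T'}\,dT'
```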

Rather than having to start exactly at absolute zero, this process can be facilitated by starting at a somewhat higher temperature, still cold, but higher, and noting that the molar entropy at that temperature ought to be equal to the constant-pressure heat capacity at that temperature divided by 3, because of the Debye T³ law.
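A quick numerical sketch of why S(T) = Cp(T)/3 follows from the Debye T³ law. The coefficient `a` below is an arbitrary illustrative value, not data for any real solid:

```python
# At low temperature the Debye model gives Cp(T) = a*T**3 for an insulating solid.
# Then S(T) = ∫₀^T Cp(T')/T' dT' = ∫₀^T a*T'**2 dT' = a*T**3/3 = Cp(T)/3.

a = 4.5e-4   # J/(mol K^4), illustrative value only
T = 15.0     # kelvin: "still cold, but higher" than 0 K

def cp(t):
    """Debye T-cubed constant-pressure heat capacity."""
    return a * t**3

# Midpoint-rule integration of Cp(T')/T' from 0 to T; the integrand a*T'**2
# is finite at T' = 0, so there is no singularity at the lower limit.
n = 100_000
dt = T / n
S = sum(cp((i + 0.5) * dt) / ((i + 0.5) * dt) for i in range(n)) * dt

print(S, cp(T) / 3)  # the two values agree
```

The agreement is exact in the limit of fine integration steps, which is what lets the experimental integration start at a small finite temperature instead of 0 K.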

Entropies that are measured in this fashion, from experimental measurements, are in near-quantitative agreement with results that would be predicted from the partition function using S equals k log Q plus kT times the partial derivative of log Q with respect to T, at constant number of particles and constant volume. Something I don't know that I said when I presented all this, by the way, is that we tended only to make this comparison for gases. And that's because we actually have a pretty good way to construct the partition function for gases.
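In symbols, the statistical-mechanical entropy expression being referenced is the canonical-ensemble result:

```latex
S = k_\mathrm{B} \ln Q + k_\mathrm{B} T \left( \frac{\partial \ln Q}{\partial T} \right)_{N,V}
```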

Many of them behave nearly ideally. You might remember that you saw a correction for nitrogen that was very, very small, so at the temperature and pressure we were looking at, it was pretty close to ideal. Now, while we would still expect to see this near-quantitative agreement, say, for a liquid, the trouble is we might have a much more difficult time coming up with a good partition function for the liquid. That is not the subject of this course, but suffice it to say it can be quite difficult. In any case, once we get into the gaseous regime, we are golden with statistical molecular thermodynamics.

In terms of the degrees of freedom that contribute to entropy, ordered by quantitative importance: the translational energy levels, and the ability to access many of them, contribute the most to the entropy of a substance; the rotational levels are next in importance; vibrational are lower; and electronic excitation even lower. These last two, in many of the examples we looked at, made virtually no contribution at all, so that last greater-than symbol is somewhat notional in a way, but translation is dominant, and rotation is often very important. And again, it's really gases we're talking about here; in condensed phases we'd need to talk about other phenomena associated with motion.

As particle mass increases, the translational entropy also increases, and it does so logarithmically with the mass.
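The logarithmic mass dependence can be seen in the Sackur–Tetrode equation for the translational entropy of an ideal monatomic gas (a standard result; the constants are CODATA values, and helium and argon are chosen purely for illustration):

```python
import math

# Physical constants (SI, CODATA values)
k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J s
R = 8.314462618       # gas constant, J/(mol K)
N_A = 6.02214076e23   # Avogadro's number, 1/mol

def sackur_tetrode(molar_mass_kg, T=298.15, p=1.0e5):
    """Molar translational entropy of an ideal monatomic gas, J/(mol K)."""
    m = molar_mass_kg / N_A                       # mass of one particle
    lam = h / math.sqrt(2 * math.pi * m * k * T)  # thermal de Broglie wavelength
    v_per_particle = k * T / p                    # V/N for an ideal gas
    return R * (math.log(v_per_particle / lam**3) + 2.5)

# Because mass enters only through ln(m**1.5), a tenfold mass increase adds
# just (3/2) R ln 10 ≈ 28.7 J/(mol K) -- growth is logarithmic, not linear.
s_he = sackur_tetrode(4.0026e-3)    # helium,  roughly 126 J/(mol K)
s_ar = sackur_tetrode(39.948e-3)    # argon,   roughly 155 J/(mol K)
print(s_he, s_ar)
```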

Very stiff, insulating solids, like diamond, have very low entropies near zero kelvin. Conductors approach low entropies much less rapidly, precisely because they are conductors: they have electronic motion that needs to be considered as well.

We also looked at larger molecules than just, say, monatomic and diatomic gases, and in general, the more atoms a molecule has, the greater its entropy at a given temperature. That's because the additional atoms will generally bring increased mass, larger moments of inertia, and perhaps degrees of vibrational freedom where the vibrational temperature is low enough that they begin to contribute some disorder of their own.

Residual entropy is a phenomenon that can be associated with a system that fails experimentally to access a perfect crystal at 0 kelvin, and I used the example of carbon monoxide as one system that exhibits that kind of phenomenon. Just as was true for enthalpy, as is true for any state function, the entropies of reaction are additive and derive from the entropies of products minus the entropies of reactants.
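That products-minus-reactants bookkeeping can be sketched with the Haber synthesis N₂ + 3 H₂ → 2 NH₃ as an illustration (the standard molar entropies below are approximate textbook values at 298 K and 1 bar):

```python
# Approximate standard molar entropies, J/(mol K), at 298 K -- textbook values.
S_STD = {"N2": 191.6, "H2": 130.7, "NH3": 192.5}

def reaction_entropy(reactants, products):
    """ΔS°_rxn = Σ ν S°(products) − Σ ν S°(reactants).

    reactants/products: dict mapping species -> stoichiometric coefficient ν.
    """
    total = lambda side: sum(nu * S_STD[sp] for sp, nu in side.items())
    return total(products) - total(reactants)

# N2 + 3 H2 -> 2 NH3: four moles of gas become two, so ΔS° is large and negative.
dS = reaction_entropy({"N2": 1, "H2": 3}, {"NH3": 2})
print(dS)  # about -199 J/(mol K)
```

The sign makes physical sense: high-entropy gas is being consumed, which connects to the next point about gases versus condensed phases.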

And last of all, entropies of gases are much, much greater than those of their corresponding condensed phases. We saw that when we looked at some specific examples of entropies of reaction.

Alright, well, we have reached a stage now where we have actually covered all three of the laws of thermodynamics: the first, the second, and the third.

In the last week of the course, we're going to take advantage of all of these laws to look at two additional thermodynamic state functions, the relationships between all these different state functions, and what we can do taking advantage of those relationships. So the first of those state functions that we look at will be the Helmholtz free energy, and we'll get started on that next week.
