This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.

From the course by University of Minnesota

Statistical Molecular Thermodynamics

From the lesson

Module 7

This module is relatively light, so if you've fallen a bit behind, you may have a chance to catch up. We examine the concept of the standard entropy, made possible by the Third Law of Thermodynamics. The measurement of third-law entropies from constant-pressure heat capacities is explained and, for gases, compared to values computed directly from molecular partition functions. The additivity of standard entropies is exploited to compute entropy changes for general chemical changes. Homework problems will provide you the opportunity to demonstrate mastery in applying these concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

So, at this point, we've armed ourselves with the First and Second Laws of Thermodynamics, and it's time to add the last and third arrow to our quiver: the Third Law of Thermodynamics.

Let me recall for you how I closed the last lecture, and that is to note that the entropy change as you go from temperature 1 to temperature 2 can be determined by integrating the constant-pressure heat capacity divided by T over that temperature range. So if we start at 0 kelvin, that says we can assign an absolute entropy at a given temperature as the entropy at 0 kelvin plus the integral now ranging from the lower limit of 0 kelvin up to T2. Well, what about 0 kelvin?
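As a sketch, that absolute-entropy integral can be evaluated numerically; the heat-capacity curve below is made up purely for illustration, not data for any real substance:

```python
import numpy as np

# Hypothetical constant-pressure heat capacity Cp(T), in J mol^-1 K^-1,
# tabulated from near 0 K up to 300 K (illustrative values only).
T = np.linspace(10.0, 300.0, 60)             # temperatures, K
Cp = 3 * 8.314 * (T / (T + 150.0))**3        # a smooth, made-up Cp(T)

# S(T2) = S(0 K) + integral from 0 K to T2 of Cp/T dT.
# Taking S(0 K) = 0 (Third Law), integrate Cp/T over the tabulated
# range with the trapezoidal rule:
integrand = Cp / T
S = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)))
print(f"S(300 K) - S(10 K) approx {S:.2f} J mol^-1 K^-1")
```

In practice the region below the lowest measured temperature has to be handled by extrapolation, but the idea is the same: accumulate Cp/T from as close to 0 K as you can get.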

And so here we have a picture of Walther Nernst, who was awarded the Nobel Prize in Chemistry in 1920, in part for his work on thermodynamics. Nernst made the suggestion, based on a number of experimental studies, that the change in entropy for chemical reactions approaches 0 as the absolute temperature approaches 0. So that's the change in entropy of reactants going to products, as the temperature goes to absolute zero. And Max Planck, who you'll recall from all the way back in the first week when we discussed quantum mechanics, had a further refinement on that suggestion: he said that the entropy of a pure substance approaches 0 at 0 kelvin.

And so a statement of the Third Law of Thermodynamics is: every substance has a finite positive entropy, but at 0 kelvin the entropy may become 0, and it does so in the case of a perfectly crystalline substance. In some sense that is the definition of a perfectly crystalline substance: it is one where the entropy goes to zero as the temperature goes to zero.

So, let's pause for a moment, because we're going to be dealing with entropy. I'd like to give you a chance to review one of the expressions for entropy we've seen up until now. And after you've had a chance to verify that you remember it, we'll return to keep looking at the Third Law.

All right, well, the Third Law was actually proposed before quantum theory was fully developed.

But statistical thermodynamics à la Boltzmann does provide some potential molecular insight into the Third Law. If you remember, one of the expressions for entropy, Boltzmann's own expression, is S = k log W, where W is a measure of the disorder. At 0 kelvin we do expect that all of the systems in the ensemble will be in the lowest energy state, that is, that W would equal 1. You'll recall that when everything is in the same state, W takes on the value of one. And in that case you'd get k log W = k log 1, and the log of 1 is 0. So, sure enough, the entropy would be equal to 0.
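As a minimal numerical sketch of S = k log W:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W: int) -> float:
    """S = k ln W, with W the number of accessible microstates."""
    return k_B * math.log(W)

# At 0 K every system in the ensemble sits in the (non-degenerate)
# ground state, so W = 1 and the entropy vanishes:
print(boltzmann_entropy(1))   # 0.0
```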

There is another way of looking at that as well, namely the probabilistic expression for entropy: S equals minus k times the sum, over all the states, of the probability of being in that state times the log of that probability. Well, if everything is in one state, the ground state, then that probability, which I'll call p0 here, is equal to 1, and pj is equal to 0 for all the other states. So as you run over the states you'd get 1 times the log of 1, and the log of 1 is 0, so that state contributes 0 to the entropy. For all the other states you get 0 times the log of 0, and while the log of 0 is undefined, L'Hôpital's rule, which I think I mentioned before, establishes that 0 times the log of 0 is also 0. And so again this is consistent with Planck's hypothesis that the entropy would be 0 at 0 kelvin.
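The probabilistic expression can be sketched the same way, taking the 0 log 0 limit as 0:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum_j p_j ln p_j, with the convention 0 * ln 0 = 0
    (the limit established by L'Hopital's rule)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# All probability concentrated in the ground state, as at 0 K:
print(gibbs_entropy([1.0, 0.0, 0.0]))
```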

Now there is a question that comes up: what if the ground state has degeneracy? Well, let's take a look at that. Let's imagine that indeed the ground state is degenerate, and we'll work with the probability expression for entropy. If the ground state is n-fold degenerate, then there is an equal probability, 1 over n, of being in any of those n degenerate states. And so I would now be summing to a definite limit: I sum from one to n over the n degeneracies. The probability is 1 over n, so I get the sum of 1 over n times the log of 1 over n. Since I'm adding this together n times, that's n times this expression, and n times 1 over n is of course just 1. So all the stuff out front goes away except Boltzmann's constant, and I'll change the negative sign to a positive sign by swapping from log of 1 over n to just log n. And so the entropy for that n-fold degenerate ground state will be k log n.

I'll ask you to remember that Boltzmann's constant is a very, very small value, 1.38 times 10 to the minus 23rd joules per kelvin. So for a single system, even when n is very, very large, that would still be a very, very small number once multiplied by Boltzmann's constant.
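A quick numerical check of the k log n result, using an arbitrary illustrative degeneracy:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

n = 6  # a hypothetical n-fold degenerate ground state
# Direct Gibbs sum: -k * [n terms of (1/n) ln(1/n)] ...
S_direct = -k_B * sum((1 / n) * math.log(1 / n) for _ in range(n))
# ... collapses to the closed form k ln n:
S_closed = k_B * math.log(n)
print(S_direct, S_closed)

# Even for an enormous degeneracy, k ln n stays immeasurably small
# for a single system:
print(k_B * math.log(10**20))
```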

And so the entropy is still very, very close to 0.

Well, before manipulating things with the Third Law more and using them to assign third-law entropies, it's worthwhile to touch on a little bit of history that I think is pretty interesting with respect to the Third Law. To do that, let me introduce you to William Giauque. He was a professor of chemistry at the University of California, Berkeley.

He was awarded the Nobel Prize in 1949 for his contributions in the field of chemical thermodynamics, particularly concerning the behavior of substances at extremely low temperatures.

And how might you access a very low temperature? We've actually already been exposed to one way to do cooling, and that was to do adiabatic expansions of ideal gases: those gases cool, and if you then put them in contact with something that's warmer, they'll pull heat out of that warmer thing until they're at equilibrium. And you can cycle them on and off to keep pulling heat out of something. So let's just look at that process again and reacquaint ourselves.

If we think of the vapor cycle for refrigeration, I'll start with some gas in a container with a piston on top. If I compress it adiabatically, so I isolate the system, it's not touching anything and can't exchange heat with anything, the compression causes adiabatic heating. I now have a smaller volume of gas, and I've drawn it a little pink-red; it's a warm gas. Now I do put it in contact with some surroundings, and I let it come into equilibrium with those surroundings at a modest temperature.

So I allow heat to flow out of the system, and now I have a compressed gas at a lower temperature. Then I expand it adiabatically, and again I'm isolated, so because it's expanding its temperature drops, and it's become a blue gas, blue for cold. Now I place it in contact with something that I would like to take the heat out of, something at a warmer temperature. Some of that heat will flow into the gas, and I can repeat this cycle: I keep dumping the heat into some outside reservoir, while the thing I'm interested in cooling is what I keep taking the heat from. So, I cycle again and again.

And I can cool something. Now, how cold can I get something? Well, that will depend on the gas, because at some point the temperature of the thing I'm trying to cool will equal the temperature at which that gas liquefies. And I can't run a vapor cycle if I'm now so cold that I don't have any vapor anymore, just a liquid. Liquids are not compressible this way, and they don't do this sort of refrigeration. And so you can ask yourself: what is the coldest you can get with a gas before it liquefies?

And the answer is helium. Helium only liquefies because of dispersion, which we've already discussed as the induced dipole-induced dipole interaction that brings otherwise non-polar molecules together. Helium liquefies around 4 kelvin, and actually if you separate the isotopes of helium you can go even a little colder: you can get close to 3 kelvin with helium-3. However, that's it, and physicists for a while thought maybe that's the limit of what you should ever expect to do.

You can't get closer; there was no obvious process to get closer to absolute zero. And it was Giauque who had the idea of a different kind of refrigeration cycle, called magnetic refrigeration.

So, certain materials, magnetic materials, have magnetic moments associated with their molecules or atoms; it doesn't really matter which for our purposes, so let's just imagine that we have a material that is highly disordered. A magnet has a direction: it's got a north and a south pole, if you like. At reasonably high temperatures the interactions between the magnets are extremely weak, so thermally they can point in all sorts of directions, and there's actually a lot of entropy associated with that.

What Giauque suggested was: okay, place this material in a strong magnetic field. That's what the boldface H is here; it's not enthalpy, it's H representing a magnetic field. Well, the magnetic moments, as I'll call them rather than spins, will align themselves with the magnetic field, and you will have reduced the entropy substantially. And because this is an adiabatic process, so this label still applies, this is adiabatic heating: the temperature will go up. I did this while the system was isolated, and if the entropy went down, the temperature had to go up, because there was no heat flow from the outside.

At that stage, with all the moments aligned, you can put the material in contact with something into which it will dump its excess heat, lowering its temperature to some new value. Now turn the external magnetic field off after isolating the system adiabatically. Well, the spins, sorry, the magnetic moments, will return to a state of high entropy; they will unalign with each other. And in the process, because it's adiabatic, the temperature must drop, so the material will cool. At that stage, you can put it in contact with whatever it is that you would like to cool further; maybe it's actually your liquid helium. You'd like to make that liquid helium even colder and study its properties as it goes below 3 or 4 kelvin. And we just keep that cycle going.

So, you'd turn the magnetic field on, turn the magnetic field off, constantly isolating the system to do the adiabatic processes, and placing it into contact with the baths you either want to cool or dump heat into.

And with that process Giauque himself reached a temperature of 0.25 kelvin, a quarter of a kelvin. Much, much lower temperatures have since been achieved by this process, which is also known as adiabatic demagnetization: temperatures down to a thousandth of a kelvin or lower. To some extent that sounds like a small gain; to go from a quarter to a thousandth is much, much less than one degree, right? But remember that temperature appears as a quantity in expressions like e to the minus delta G over RT, and the difference between 0.25 and, say, 0.0025 is a factor of a hundred. So it's not just a little bit of a degree; 0.25 kelvin is a hundred times hotter.
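To illustrate how strongly that factor of a hundred in temperature enters through the exponential, here's a sketch using a purely hypothetical energy scale (the 0.05 J/mol value is illustrative, not from the lecture):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def boltzmann_factor(delta_G: float, T: float) -> float:
    """exp(-delta_G / RT): the factor through which temperature
    enters expressions of the e^(-dG/RT) form."""
    return math.exp(-delta_G / (R * T))

# A hypothetical energy difference of 0.05 J/mol, evaluated at
# 0.25 K and at 0.0025 K -- a hundredfold change in T:
dG = 0.05
for T in (0.25, 0.0025):
    print(T, boltzmann_factor(dG, T))
```

The same energy difference that is easily accessible thermally at a quarter of a kelvin becomes strongly suppressed at a few millikelvin, which is why these regimes behave so differently.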

So there are enormous differences between what can happen at, say, 1 kelvin, 2 kelvin, or 3 kelvin, and it's been very interesting to study the properties of materials at these very low temperatures. But more importantly, from the point of view of what we began this week discussing, this establishes a way to get effectively arbitrarily close to absolute zero and to have this base entropy against which to begin adding heat-capacity entropies, in order to establish third-law entropies, absolute entropies. We will look at that more as the thermodynamics train continues on down this track. And so next time we'll take a look at Standard Entropy.
