This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.

From the course by University of Minnesota

Statistical Molecular Thermodynamics


From the lesson

Module 6

This module introduces a new state function, entropy, that is in many respects more conceptually challenging than energy. The relationship of entropy to extent of disorder is established, and its governance by the Second Law of Thermodynamics is described. The role of entropy in dictating spontaneity in isolated systems is explored. The statistical underpinnings of entropy are established, including equations relating it to disorder, degeneracy, and probability. We derive the relationship between entropy and the partition function and establish the nature of the constant β in Boltzmann's famous equation for entropy. Finally, we consider the role of entropy in dictating the maximum efficiency that can be achieved by a heat engine based on consideration of the Carnot cycle. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

Let's try computing some entropy changes. We'll look at two systems we've seen before, and a new one as well.

Let me start with the ideal gas expanding into a vacuum. That is not a reversible process. But because entropy is a state function, the change in entropy does not depend on the path. And so, if we're interested in the entropy change for this irreversible process, we can compute it if we can compute the entropy change for a reversible process connecting the same two states. That is, if I integrate δq_rev/T from state point one to state point two, that will be the entropy change for the irreversible process as well, because it depends only on the state points, one and two.

So, what is the change in heat, δq_rev? From the first law, it's equal to dU minus the change in work. Now, the irreversible process was isothermal, and we had isolated the system from the surroundings. The reversible analog doesn't have to be isolated; in fact, we'll see it can't be. But that's okay; what we want to maintain is that dU is equal to zero. Given that the change in internal energy is zero for an isothermal process, because an ideal gas's energy depends only on temperature, the reversible heat change must be equal to minus the reversible work. And we know what the reversible work is for an ideal gas: δw_rev = −(nRT/V) dV.

So when I plug all that in, the integral from one to two of δq_rev/T is minus the integral from one to two of δw_rev/T. Here's my expression, and the Ts drop out. When I talk about going from one to two, I'm really talking about the change in volume: what's left in the integral is dV/V, and I get nR ln(V2/V1). We've seen that expression before for the expansion of an ideal gas.

You might ask yourself: what was the difference between the irreversible process and the reversible process? This is often a point of confusion. The difference is what happened to the surroundings. In the reversible case, in order for the system to remain isothermal, heat had to be added to the system, because work is being done during the reversible expansion. The pressure is growing and growing inside that other vessel as I allow the gas to expand into it, and to drive that expansion heat must flow into the system. It's done infinitesimally slowly, so you could imagine that piston with sand on it again, removing one grain at a time; that's why the external pressure stays equal to the ideal gas pressure. It's an imaginable experiment. And if this was the reversible work, it's also the reversible heat, with a sign change in there.

So the gas absorbs heat from the surroundings, and the entropy of the surroundings must decrease as a result, right? Because for the surroundings there is negative heat: it is giving up heat to the system, which receives it as a positive quantity. So ΔS_surroundings is −q_rev/T. Since this is reversible, it must be equal and opposite, so I get −nR ln(V2/V1). The total change in entropy of the system plus the surroundings? Zero, which is as it must be: it was a reversible process, so there is no change in total entropy. That's always expected for a reversible process.

Focusing now on the irreversible case: when I open that stopcock all at once and the gas expands isothermally, there is no external pressure. So the irreversible work is zero, and the system was isolated, so the irreversible heat flow is zero. Since there's no exchange of heat with the surroundings, there's no change in entropy for the surroundings; the surroundings doesn't even know what happened, because the system was isolated from it. And so, for the net entropy change, system plus surroundings: entropy is a state function, so we already know what happened in the system, nR ln(V2/V1), while the surroundings contributes zero. For the universe, then, system plus surroundings, there has been a net increase in entropy.

That's consistent with our statement of the second law: for the reversible process there was no change, and for the irreversible process there was a net increase in the entropy of the universe. Total entropy increases, as expected for an irreversible process.
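The bookkeeping in this first example can be sketched numerically. This is a minimal illustration, not part of the lecture; the numbers (one mole of ideal gas doubling its volume) are assumed for the example.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_S_system(n, V1, V2):
    """Entropy change of an ideal gas expanding isothermally: nR ln(V2/V1)."""
    return n * R * math.log(V2 / V1)

n, V1, V2 = 1.0, 1.0, 2.0  # assumed: 1 mol doubling its volume

dS_sys = delta_S_system(n, V1, V2)

# Reversible isothermal expansion: the surroundings supply q_rev, so their
# entropy change is equal and opposite, and the total is zero.
dS_surr_rev = -dS_sys

# Irreversible expansion into vacuum: no heat exchanged, so the
# surroundings' entropy is unchanged and the total is positive.
dS_surr_irrev = 0.0

print(f"dS_system          = {dS_sys:.3f} J/K")
print(f"reversible total   = {dS_sys + dS_surr_rev:.3f} J/K")
print(f"irreversible total = {dS_sys + dS_surr_irrev:.3f} J/K")
```

The system's entropy change is identical in the two cases, as the state-function argument requires; only the surroundings differ.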

I will add, as a sort of technical note, that we used the irreversible heat to compute the change in entropy of the surroundings. In a sense it was zero, so maybe that doesn't seem very unnatural, but typically you can't do that. We get away with it in this case because there is zero work, and so, from a technical standpoint, that made the heat a state function; we didn't have to worry about whether it was reversible or irreversible. The fact that it was zero is a benefit as well. But that's something that's good to bear in mind if you're really looking at the nitty-gritty of the thermodynamics. Okay, let's take a moment; I'll let you answer a question on this front, and then we'll move on to consider entropy of mixing.

Let's look at the second example we've already considered: two vessels with different gases that are then opened one to another. I've only got two gases in this instance, but suppose I generalize and have several different gases, indexed by i, all allowed to enter one another's volumes at a given point. We've already worked out, again and again, what the entropy change is as a gas expands from a given volume to a different volume. It is, for each individual gas, the number of moles of that gas, times R, times the log of the sum over all the volumes that are now accessible (maybe there are j different flasks, all interconnected) divided by the original volume. And, for purposes that will become apparent in a moment, if I want to put a minus sign out front, I can put that sum in the denominator instead. All I've done is invert the argument of the logarithm and hence change its sign.

However, for ideal gases all at the same temperature and pressure, the volume is proportional to the number of moles. And so I can replace volume, wherever it appears, with number of moles. I get that the change in entropy for a given gas i is minus the number of moles of i, times R, times the log of the number of moles of i divided by the total number of moles in the system. That ratio is sometimes referred to as the mole fraction, and I'll indicate it by y_i: the number of moles of i divided by the total number of moles. So this entropy of mixing is ΔS_mix = −R Σ_i n_i ln y_i.

Notice that the mole fraction, as long as there's more than one gas, is always less than 1. The log of a number less than 1 is always negative, and it's preceded by a negative sign; the number of moles is positive, and R is positive. So the entropy of mixing is always positive: mixing is always spontaneous.
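The mixing formula is easy to check numerically. A minimal sketch, not from the lecture; the composition (equal amounts of two gases) is an assumed example.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(moles):
    """Entropy of mixing ideal gases: dS_mix = -R * sum(n_i * ln(y_i)),
    where y_i = n_i / n_total is the mole fraction."""
    n_total = sum(moles)
    return -R * sum(n_i * math.log(n_i / n_total) for n_i in moles)

# Assumed example: one mole each of two different gases.
dS = mixing_entropy([1.0, 1.0])
print(f"dS_mix = {dS:.3f} J/K")  # positive, so mixing is spontaneous
```

Because every mole fraction is below 1, each term in the sum is positive, and the function returns a positive value for any mixture of two or more gases.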

Let's do one last example, one we haven't seen before. Imagine that I have two identical pieces of a metal bar, a common material. Maybe it's copper, maybe it's manganese; pick your favorite metal. They're at different temperatures, hot and cold, represented brilliantly here with a red piece and a blue piece: the red one is hot, at Th, and the blue one is cold, at Tc. I bring them together so they touch. We know that heat will flow between those two systems. It's a bit like the isolated compartments we did previously, but now it's more practical: two metal bars.

There will be almost no change in the volume of these bars, as long as I don't have, you know, thousands of degrees of change. So the work is negligible, because ΔV is negligible, and the reversible heat change is equal to dU; it's equal to just δq, because given no work, q becomes a state function. We can talk about reversible or irreversible; it doesn't matter. There's no work, so you don't have to worry about a path for q anymore. And it's equal to the heat capacity times dT.

Just to make the math a little more convenient, let's assume that the heat capacity, over the temperature range we're interested in, is independent of temperature: it takes the same amount of heat to go up one degree, the next degree, and the degree after that. A constant heat capacity. In that case the total heat transferred to a bar is the heat capacity times the difference between its final and initial temperatures, whatever temperature I finish up at. Moreover, the heat lost has to be equal to the heat gained: CV times (Th minus the final temperature) must equal CV times (the final temperature minus Tc). That's not a rocket-science equation: it means the final temperature has to be the average of the original two temperatures, Tf = (Th + Tc)/2, a conclusion that seems obvious.

So then, what is the change in entropy? Well, for each of these bars it is the integral of δq/T from the initial temperature to the final temperature, irrespective of whether that initial temperature is the hot one or the cold one. I don't have to write "reversible" here because heat is a state function in this case; there's no work. So I replace δq with CV dT, and I end up with CV times the integral from initial to final temperature of dT/T. That's CV ln(T_final/T_initial).

Okay, let's keep working with that, rod by rod. What is this integral in the case of the cold rod? The final temperature is (Tc + Th)/2, and the initial temperature is the cold temperature, Tc, so ΔS_cold = CV ln[(Tc + Th)/2Tc]. Now let's do the hot rod (we don't mean a very fast car in this instance, by the way; that's the old use of "hot rod"). Again (Tc + Th)/2 is the final temperature, and the original temperature is Th, so these two expressions look the same, except one has a c in the denominator and one has an h.

The total entropy change is the sum of the two. A sum of logarithms is the log of a product. Taking the product of the numerators I get the quantity (Tc + Th) squared, and taking the product of the denominators (the two factors of 2 end up there as well) I get 4TcTh. So ΔS_total = CV ln[(Tc + Th)² / (4TcTh)].

And I can ask now: will this be spontaneous? It will be spontaneous if ΔS is greater than zero. How can I establish that? I need the argument of the logarithm to be greater than one, which means I need the numerator to be greater than the denominator. If that's always true, then I have the log of a number greater than one, ΔS will be greater than zero, and the heat flow will be spontaneous.

Here's a quick little proof. Consider (Tc − Th)². Since that's a number squared, it's greater than zero, unless the bars start at the same temperature, in which case I don't need to worry about heat flow at all. Expanding the square: Tc² − 2TcTh + Th² > 0. Add 4TcTh to both sides: the −2TcTh becomes +2TcTh on the left, and the right side becomes 4TcTh. But look at the left side now: it's exactly (Tc + Th)². So I have just proven that the numerator, (Tc + Th)², is greater than the denominator, 4TcTh. So, indeed, as we would expect based on our life experience, ΔS is positive: those two rods spontaneously come to an equilibrium temperature that's halfway in between their original temperatures. Great.
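The two-bar result can also be verified numerically. A small sketch, not from the lecture; the heat capacity and temperatures are assumed for illustration.

```python
import math

def total_entropy_change(Cv, Th, Tc):
    """Two identical bars with constant heat capacity Cv brought into contact.
    Each bar contributes Cv ln(Tf/Ti); the sum is Cv ln((Th+Tc)^2 / (4 Th Tc))."""
    Tf = (Th + Tc) / 2.0            # final temperature is the average
    dS_hot = Cv * math.log(Tf / Th)   # hot bar cools: negative contribution
    dS_cold = Cv * math.log(Tf / Tc)  # cold bar warms: positive, and larger
    return dS_hot + dS_cold

# Assumed numbers: Cv = 24.4 J/K (roughly one mole of copper), 400 K and 300 K.
dS = total_entropy_change(24.4, 400.0, 300.0)
print(f"dS_total = {dS:.4f} J/K")  # positive, so the heat flow is spontaneous
```

Note that the cold bar's entropy gain always outweighs the hot bar's loss, which is the content of the (Tc + Th)² > 4TcTh inequality proved above; if the bars start at the same temperature, the function returns exactly zero.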

Well, those were three different systems where we've seen how we can use reversible processes to learn about the entropy change, even when it's associated with an irreversible process, knowing that there will then be a net change in the total entropy of the universe. We know how to work with reversible changes, and that allows us to accomplish things and actually do practical computations.

Next we're going to go on and consider the relationship of entropy to the partition function. Before we get to that, though, I think it is time for another demonstration, and a particularly interesting one that seems, briefly, to violate the entropy of mixing. I'll call this demo the anti-entropy of mixing. I hope you enjoy it.

In this demonstration, we'll see that sometimes we can fight thermodynamics almost to a draw, even if we can never prevail against one of the three laws. On the bench here, I have a cylindrical flask, inside of which is another cylinder of only slightly smaller diameter. In between the two is a fluid that is clear and colorless. At one position along the cylinder, a line of dye has been added to the fluid. Do you see? What will happen if I turn the inner cylinder? Certainly we know that the entropy of mixing will favor distributing the dye uniformly throughout the available volume. So, let's see what happens.

Sure enough, as I turn the cylinder, the dye distributes, and by the time I've done about two revolutions, the color seems to be widely distributed throughout, even if not yet completely uniform, about as one might expect. Now, what will happen if I reverse my turning of the inner cylinder? We know that mixed substances cannot spontaneously separate, unless there is some kind of phase change. So the normal expectation would be for nothing obvious to happen, but let's try.

Wow. All the dye has returned, almost perfectly, to its original position. What's happened? Have we beaten entropy? The answer is no, although your eyes have been tricked into thinking so. The trick involved is that the very thin layer of fluid between the two cylinders causes the flow in the liquid to be extremely uniform, with little exchange between adjacent small volume elements. The apparently homogeneous mixture created by our initial rotation of the inner cylinder was not actually homogeneous; it simply distributed the dye uniformly around the circumference, with the possibility of bringing it back by reversing the motion. If we were to measure carefully, we would see that there has been some broadening of the dye line, driven by entropy. And if we were to cycle back and forth repeatedly, with each cycle the mixing would become more complete, until indeed we ended up with a homogeneous mixture of the dye in the fluid.

So remember: the second law does allow for reversible processes having zero entropy change. While they're often hard to engineer in practice, they are not impossible, and this device is one example that illustrates a process that is very nearly reversible.

Â Coursera provides universal access to the worldâ€™s best education,
partnering with top universities and organizations to offer courses online.