This is a collection of excerpts (with a few additions)
from the Thermodynamics of chemical equilibrium.
Disorder is more probable than order because there are so many more ways of achieving it. Thus coins and cards tend to assume random configurations when tossed or shuffled, and socks and books tend to become more scattered about a teenager's room during the course of daily living. But there are some important differences between these large-scale mechanical (macro) systems and the collections of sub-microscopic particles that constitute the stuff of chemistry.
See Frank Lambert's page "Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms — Examples of Entropy Increase? Nonsense!"
The importance of these last two points is far greater than you might at first think, but to fully appreciate them, you must recall the various ways in which thermal energy is stored in molecules; hence the following brief review.
Thermal energy is the portion of a molecule's energy that is proportional to its temperature, and thus relates to motion at the molecular scale. What kinds of molecular motions are possible? For monatomic molecules, there is only one: actual movement from one location to another, which we call translation. Since there are three directions in space, all molecules possess three modes of translational motion.
For polyatomic molecules, two additional kinds of motions are possible. One of these is rotation; a linear molecule such as CO2 in which the atoms are all laid out along the x-axis can rotate along the y- and z-axes, while molecules having less symmetry can rotate about all three axes. Thus linear molecules possess two modes of rotational motion, while non-linear ones have three rotational modes.
Finally, molecules consisting of two or more atoms can undergo internal vibrations. For freely moving molecules in a gas, the number of vibrational modes or patterns depends on both the number of atoms and the shape of the molecule, and it increases rapidly as the molecule becomes more complicated.
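The counting behind these statements (3N total degrees of freedom for an N-atom molecule, of which 3 are translations and 2 or 3 are rotations) can be sketched in a few lines; the example molecules are just common illustrations:

```python
def vibrational_modes(n_atoms, linear):
    """Vibrational modes of an N-atom molecule: 3N - 5 if linear, 3N - 6 otherwise.
    (3N total degrees of freedom, minus 3 translations and 2 or 3 rotations.)"""
    return 3 * n_atoms - (5 if linear else 6)

print(vibrational_modes(2, linear=True))    # any diatomic: 1 mode
print(vibrational_modes(3, linear=True))    # linear CO2: 4 modes
print(vibrational_modes(3, linear=False))   # bent H2O: 3 modes
print(vibrational_modes(11, linear=False))  # propane (C3H8, 11 atoms): 27 modes
```

Note how quickly the vibrational count outpaces the fixed translational and rotational counts as the molecule grows.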
The relative populations of the quantized translational, rotational and vibrational energy states of a typical diatomic molecule are depicted by the thickness of the lines in this schematic (not-to-scale!) diagram. The colored shading indicates the total thermal energy available at a given temperature. The numbers at the top show order-of-magnitude spacings between adjacent levels. It is readily apparent that virtually all the thermal energy resides in translational states.
Notice the greatly different spacing of the three kinds of energy levels. This is extremely important because it determines the number of energy quanta that a molecule can accept, and, as the following illustration shows, the number of different ways this energy can be distributed amongst the molecules.
The more closely spaced the quantized energy states of a molecule, the greater will be the number of ways in which a given quantity of thermal energy can be shared amongst a collection of these molecules.
The spacing of molecular energy states becomes closer as the mass and number of bonds in the molecule increases, so we can generally say that the more complex the molecule, the greater the density of its energy states.
At the atomic and molecular level, all energy is quantized; each particle possesses discrete states of kinetic energy and is able to accept thermal energy only in packets whose values correspond to the energies of one or more of these states. Polyatomic molecules can store energy in rotational and vibrational motions, and all molecules (even monatomic ones) will possess translational kinetic energy (thermal energy) at all temperatures above absolute zero. The energy difference between adjacent translational states is so minute that translational kinetic energy can be regarded as continuous (non-quantized) for most practical purposes.
The number of ways in which thermal energy can be distributed amongst the allowed states within a collection of molecules is easily calculated from simple statistics, but we will confine ourselves to an example here. Suppose that we have a system consisting of three molecules and three quanta of energy to share among them. We can give all the kinetic energy to any one molecule, leaving the others with none, we can give two units to one molecule and one unit to another, or we can share out the energy equally and give one unit to each molecule. All told, there are ten possible ways of distributing three units of energy among three identical molecules as shown here:
Each of these ten possibilities represents a distinct microstate that will describe the system at any instant in time. Those microstates that possess identical distributions of energy among the accessible quantum levels (and differ only in which particular molecules occupy the levels) are known as configurations. Because all microstates are equally probable, the probability of any one configuration is proportional to the number of microstates that can produce it. Thus in the system shown above, the configuration labeled ii will be observed 60% of the time, while iii will occur only 10% of the time.
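The three-molecule, three-quantum example can be checked by brute-force enumeration. This sketch counts the microstates and groups them into configurations, reproducing the 10 microstates and the 60%/30%/10% configuration probabilities described above:

```python
from itertools import product
from collections import Counter

# Enumerate every way to assign 3 quanta to 3 distinguishable molecules.
microstates = [m for m in product(range(4), repeat=3) if sum(m) == 3]
print(len(microstates))  # 10 microstates, as in the text

# A configuration is the multiset of occupation numbers, ignoring
# which particular molecule holds which quanta.
configs = Counter(tuple(sorted(m, reverse=True)) for m in microstates)
for c, n in sorted(configs.items()):
    print(c, n, f"{n / len(microstates):.0%}")
# (1, 1, 1) occurs 1 way  (10%)
# (2, 1, 0) occurs 6 ways (60%)
# (3, 0, 0) occurs 3 ways (30%)
```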
As the number of molecules and the number of quanta increases, the number of accessible microstates grows explosively; if 1000 quanta of energy are shared by 1000 molecules, the number of available microstates will be around 10⁶⁰⁰, a number that greatly exceeds the number of atoms in the observable universe! The number of possible configurations (as defined above) also increases, but in such a way as to greatly reduce the probability of all but the most probable configurations. Thus for a sample of a gas large enough to be observable under normal conditions, only a single configuration (energy distribution amongst the quantum states) need be considered; even the second-most-probable configuration can be neglected.
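The general count comes from the standard "stars and bars" combinatorial formula: q indistinguishable quanta can be distributed among n distinguishable molecules in C(n + q − 1, q) ways. A quick sketch confirms both the small example and the order of magnitude quoted for 1000 quanta among 1000 molecules:

```python
from math import comb

def microstate_count(n, q):
    """Stars and bars: ways to distribute q indistinguishable quanta
    among n distinguishable molecules."""
    return comb(n + q - 1, q)

print(microstate_count(3, 3))   # 10, the three-molecule example
w = microstate_count(1000, 1000)
print(len(str(w)) - 1)          # order of magnitude: roughly 600
```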
The bottom line: any collection of molecules large enough in numbers to have chemical significance will have its thermal energy distributed over an unimaginably large number of microstates. The number of microstates increases exponentially as more energy states become accessible, owing to the addition of energy quanta (a higher temperature), an increase in the volume of the system, or an increase in the number of molecules (as a result of dissociation, for example).
Energy is conserved; if you lift a book off the table, and let it fall, the total amount of energy in the world remains unchanged. All you have done is transferred it from the form in which it was stored within the glucose in your body to your muscles, and then to the book (that is, you did work on the book by moving it up against the earth's gravitational field). After the book has fallen, this same quantity of energy exists as thermal energy (heat) in the book and table top.
What has changed, however, is the availability of this energy. Once the energy has spread into the huge number of thermal microstates in the warmed objects, the probability of its spontaneously (that is, by chance) becoming un-dispersed is essentially zero. Thus although the energy is still there, it is forever beyond utilization or recovery.
The profundity of this conclusion was recognized around 1900, when it was first described as the heat death of the world. This refers to the fact that every spontaneous process (essentially every change that occurs) is accompanied by the dilution of energy. The obvious implication is that all of the molecular-level kinetic energy will eventually be spread out completely, and nothing more will ever happen. Not a happy thought!
Everybody knows that a gas, if left to itself, will tend to expand and fill the volume within which it is confined completely and uniformly. What drives this expansion? At the simplest level it is clear that with more space available, random motions of the individual molecules will inevitably disperse them throughout the space. But as we mentioned above, the allowed energy states that molecules can occupy are spaced more closely in a larger volume than in a smaller one. The larger the volume available to the gas, the greater the number of microstates its thermal energy can occupy. Since all such states within the thermally accessible range of energies are equally probable, the expansion of the gas can be viewed as a consequence of the tendency of thermal energy to be spread and shared as widely as possible. Once this has happened, the probability that this sharing of energy will reverse itself (that is, that the gas will spontaneously contract) is so minute as to be unthinkable.
Imagine a gas initially confined to one half of a box. We then remove the barrier so that it can expand into the full volume of the container. We know that the entropy of the gas will increase as the thermal energy of its molecules spreads into the enlarged space.
In terms of the spreading of thermal energy, the following diagram may be helpful. (The pink shading represents the thermally-accessible range of microstates for a given temperature.)
The tendency of a gas to expand is due to the more closely-spaced thermal energy states in the larger volume.
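The standard quantitative statement of this tendency, for an ideal gas expanding isothermally, is ΔS = nR ln(V₂/V₁); the doubling-of-volume case below is an illustrative sketch:

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

def expansion_entropy(n_mol, v_initial, v_final):
    """Entropy change for isothermal ideal-gas expansion: dS = nR ln(V2/V1)."""
    return n_mol * R * log(v_final / v_initial)

# One mole expanding into double its original volume, as when the
# barrier in the two-compartment box is removed:
print(round(expansion_entropy(1.0, 1.0, 2.0), 2))  # 5.76 J/K
```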
Mixing and dilution really amount to the same thing, especially for ideal gases.
Replace the pair of containers shown above with one containing two kinds of molecules in the separate sections. When we remove the barrier, the "red" and "blue" molecules will each expand into the space of the other. (Recall Dalton's Law that "each gas is a vacuum to the other gas".) But notice that although each gas underwent an expansion, the overall process amounts to what we call "mixing".
What is true for gaseous molecules can, in principle, apply also to solute molecules dissolved in a solvent. But bear in mind that whereas the enthalpy associated with the expansion of a perfect gas is by definition zero, ΔH's of mixing of two liquids or of dissolving a solute in a solvent have finite values which may limit the miscibility of liquids or the solubility of a solute.
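Because mixing is just a mutual expansion, the ideal entropy of mixing follows directly: ΔS = −nR Σ xᵢ ln xᵢ. A minimal sketch for the two-gas case above:

```python
from math import log

R = 8.314  # J/(mol K)

def mixing_entropy(moles):
    """Ideal entropy of mixing: dS = -nR * sum(x_i ln x_i),
    where x_i are the mole fractions."""
    n = sum(moles)
    return -n * R * sum((m / n) * log(m / n) for m in moles)

# One mole each of "red" and "blue" gas sharing the combined volume:
print(round(mixing_entropy([1.0, 1.0]), 2))  # 11.53 J/K
```

Note that this equals 2 × (R ln 2): each gas contributes the entropy of its own doubling-of-volume expansion, which is exactly the point made in the text.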
But what's really dramatic is that when just one molecule of a second gas is introduced into the container (in the larger diagram above), an unimaginably huge number of new configurations become possible, greatly increasing the number of microstates that are thermally accessible (as indicated by the pink shading above).
Just as gases spontaneously change their volumes from smaller to larger, heat always flows from a warmer body to a cooler one. Bringing the cooler body into contact with the warmer one makes new energy microstates available, allowing the thermal energy to populate a larger number of them; in effect, the thermal energy becomes more diluted.
Part a of the figure is a schematic depiction of the thermal energy states in two separated identical bodies at different temperatures (indicated by shading).
When the bodies are brought into thermal contact (b), thermal energy flows from the higher occupied levels in the warmer object into the unoccupied levels of the cooler one until equal numbers are occupied in both bodies, bringing them to the same temperature.
As you might expect, the increase in the amount of energy spreading and sharing is proportional to the amount of heat transferred q, but there is one other factor involved, and that is the temperature at which the transfer occurs. When a quantity of heat q passes into a system at temperature T, the degree of dilution of the thermal energy is given by q/T.
To understand why we have to divide by the temperature, consider the effect of very large and very small values of T in the denominator. If the body receiving the heat is initially at a very low temperature, relatively few thermal energy states are initially occupied, so the amount of energy spreading into vacant states can be very great. Conversely, if the temperature is initially large, more thermal energy is already spread around within it, and absorption of the additional energy will have a relatively small effect on the degree of thermal disorder within the body.
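A quick numeric sketch shows why dividing by T makes warmer-to-cooler flow favorable: the cool body gains more entropy than the warm body loses, so the total always increases. The temperatures and heat quantity here are arbitrary illustrative values:

```python
q = 100.0                       # heat transferred, J
T_hot, T_cold = 400.0, 300.0    # kelvin

dS_hot = -q / T_hot    # entropy lost by the warm body
dS_cold = q / T_cold   # entropy gained by the cool body; larger, since T is smaller
print(round(dS_hot + dS_cold, 3))  # +0.083 J/K: net entropy increases
```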
When a chemical reaction takes place, two kinds of changes relating to thermal energy are involved:
Shown below are schematic representations of the translational energy levels of the two components H and H2 of the hydrogen dissociation reaction. The shading shows how the relative populations of occupied microstates vary with the temperature, causing the equilibrium composition to change in favor of the dissociation product.
The ability of energy to spread into the product molecules is constrained by the availability of sufficient thermal energy to produce these molecules. This is where the temperature comes in. At absolute zero the situation is very simple; no thermal energy is available to bring about dissociation, so the only component present will be dihydrogen.
The result is exactly what the LeChâtelier Principle predicts: the equilibrium state for an endothermic reaction is shifted to the right at higher temperatures.
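The temperature dependence of the equilibrium constant for such an endothermic process is conventionally described by the van 't Hoff relation, ln(K₂/K₁) = −(ΔH/R)(1/T₂ − 1/T₁). The sketch below uses the well-known H−H bond dissociation enthalpy (about 436 kJ/mol); the two temperatures are arbitrary illustrative choices:

```python
from math import exp

R = 8.314      # J/(mol K)
dH = 436e3     # J/mol, approximate bond dissociation enthalpy of H2

# van 't Hoff: ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1)
T1, T2 = 1000.0, 2000.0
ratio = exp(-(dH / R) * (1 / T2 - 1 / T1))
print(f"{ratio:.2e}")  # K grows by many orders of magnitude on doubling T
```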
The following table generalizes these relations for the four sign combinations of ΔH and ΔS.
Exothermic reaction, ΔS > 0
C(graphite) + O2(g) → CO2(g)
ΔH° = -393 kJ
ΔS° = +2.9 J K⁻¹
ΔG° = -394 kJ at 298 K
This combustion reaction, like most such reactions, is spontaneous at all temperatures. The positive entropy change is due mainly to the greater mass of CO2 molecules compared to those of O2; as noted earlier, heavier molecules have more closely spaced energy states.
Exothermic reaction, ΔS < 0
3 H2 + N2 → 2 NH3(g)
ΔH° = -46.2 kJ
ΔS° = -389 J K⁻¹
ΔG° = -16.4 kJ at 298 K
The decrease in moles of gas in the Haber ammonia synthesis drives the entropy change negative, making the reaction spontaneous only at low temperatures. Thus higher T, which speeds up the reaction, also reduces its extent.
Endothermic reaction, ΔS > 0
N2O4(g) → 2 NO2(g)
ΔH° = +55.3 kJ
ΔS° = +176 J K⁻¹
ΔG° = +2.8 kJ at 298 K
Dissociation reactions are typically endothermic with positive entropy change, and are therefore spontaneous at high temperatures. Ultimately, all molecules decompose to their atoms at sufficiently high temperatures.
Endothermic reaction, ΔS < 0
½ N2 + O2 → NO2(g)
ΔH° = +33.2 kJ
ΔS° = -60.9 J K⁻¹
ΔG° = +51.3 kJ at 298 K
This reaction is not spontaneous at any temperature, meaning that its reverse is always spontaneous. But because the reverse reaction is kinetically inhibited, NO2 can exist indefinitely at ordinary temperatures even though it is thermodynamically unstable.
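All four cases in the table follow from the single relation ΔG = ΔH − TΔS; the sign of ΔS decides how ΔG shifts with temperature. The sketch below applies it to the N2O4 dissociation from the table and finds the crossover temperature where ΔG changes sign:

```python
def delta_G(dH_kJ, dS_J_per_K, T):
    """Gibbs free energy change in kJ: dG = dH - T * dS."""
    return dH_kJ - T * dS_J_per_K / 1000.0

# N2O4 -> 2 NO2 (endothermic, dS > 0): spontaneous only above a crossover T.
dH, dS = 55.3, 176.0
print(round(delta_G(dH, dS, 298), 1))  # ~ +2.9 kJ at 298 K: not spontaneous
print(round(dH * 1000 / dS))           # crossover near 314 K, where dG = 0
```

For the NO2 formation case (ΔH > 0, ΔS < 0), both terms make ΔG positive, which is why no temperature can make it spontaneous.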
Everybody knows that the solid is the stable form of a substance at low temperatures, while the gaseous state prevails at high temperatures. Why should this be? The diagram at the right shows that (1) the density of energy states is smallest in the solid and greatest (much, much greater) in the gas, and (2) the ground states of the liquid and gas are offset from that of the previous state by the heats of fusion and vaporization, respectively.
Changes of phase involve exchange of energy with the surroundings (whose energy content relative to the system is indicated, with much exaggeration, by the height of the yellow vertical bars below). When solid and liquid are in equilibrium (middle section of diagram below), there is sufficient thermal energy (indicated by pink shading) to populate the energy states of both phases. If heat is allowed to flow into the surroundings, it is withdrawn selectively from the more abundantly populated levels of the liquid phase, causing the quantity of this phase to decrease in favor of the solid. The temperature remains constant as the heat of fusion is returned to the system in exact compensation for the heat lost to the surroundings. Finally, after the last trace of liquid has disappeared, the only states remaining are those of the solid. Any further withdrawal of heat results in a temperature drop as the states of the solid become depopulated.
Vapor pressure lowering, boiling point elevation, freezing point depression and osmosis are well-known phenomena that occur when a non-volatile solute such as sugar or a salt is dissolved in a volatile solvent such as water. All these effects result from dilution of the solvent by the added solute, and because of this commonality they are referred to as colligative properties (Latin co ligare, to bind together). The key role of the solvent concentration is obscured by the greatly simplified expressions used to calculate the magnitude of these effects, in which only the solute concentration appears. The details of how to carry out these calculations and the many important applications of colligative properties are covered elsewhere. Our purpose here is to offer a more complete explanation of why these phenomena occur.
Basically, these all result from the effect of dilution of the solvent on its entropy, and thus in the increase in the density of energy states of the system in the solution compared to that in the pure liquid. Equilibrium between two phases (liquid-gas for boiling and solid-liquid for freezing) occurs when the energy states in each phase can be populated at equal densities. The temperatures at which this occurs are depicted by the shading.
Dilution of the solvent adds new energy states to the liquid, but does not affect the vapor phase. This raises the temperature required to make equal numbers of microstates accessible in the two phases.
Dilution of the solvent adds new energy states to the liquid, but does not affect the solid phase. This reduces the temperature required to make equal numbers of states accessible in the two phases.
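The "greatly simplified expressions" mentioned earlier reduce both shifts to a constant times the solute molality. A sketch using the standard ebullioscopic and cryoscopic constants for water (the molality chosen is arbitrary, and the linear formulas hold only for dilute solutions):

```python
Kb = 0.512   # ebullioscopic constant of water, K kg/mol
Kf = 1.86    # cryoscopic constant of water, K kg/mol

m = 0.5      # molality of a non-volatile solute, mol per kg of water
print(round(Kb * m, 2))  # boiling point raised by ~0.26 K
print(round(Kf * m, 2))  # freezing point lowered by ~0.93 K
```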
When a liquid is subjected to hydrostatic pressure (for example, by an inert, non-dissolving gas that occupies the vapor space above the surface), the vapor pressure of the liquid is raised. The pressure acts to compress the liquid very slightly, effectively narrowing the potential energy well in which the individual molecules reside and thus increasing their tendency to escape from the liquid phase. (Because liquids are not very compressible, the effect is quite small; a 100-atm applied pressure will raise the vapor pressure of water at 25°C by only about 2 torr.) In terms of the entropy, we can say that the applied pressure reduces the dimensions of the "box" within which the principal translational motions of the molecules are confined within the liquid, thus reducing the density of energy states in the liquid phase.
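The "about 2 torr" figure can be checked with the standard Poynting correction, ln(p/p₀) = Vm·ΔP/(RT), using the molar volume of liquid water; this is an order-of-magnitude sketch rather than a precise calculation:

```python
from math import exp

Vm = 1.807e-5        # molar volume of liquid water, m^3/mol
dP = 100 * 101325.0  # applied pressure of 100 atm, in Pa
R, T = 8.314, 298.15
p0 = 23.8            # vapor pressure of water at 25 C, torr

# Poynting correction: ln(p/p0) = Vm * dP / (R T)
p = p0 * exp(Vm * dP / (R * T))
print(round(p - p0, 1))  # ~1.8 torr, consistent with "about 2 torr"
```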
Applying hydrostatic pressure to a liquid increases the spacing of its microstates, so that the number of energetically accessible states in the gas, although unchanged, is relatively greater, thus increasing the tendency of molecules to escape into the vapor phase. In terms of free energy, the higher pressure raises the free energy of the liquid, but does not affect that of the gas phase.
This phenomenon can explain osmotic pressure. Osmotic pressure, students must be reminded, is not what drives osmosis, but is rather the hydrostatic pressure that must be applied to the more concentrated solution (more dilute solvent) in order to stop osmotic flow of solvent into the solution. The effect of this pressure Π is to slightly increase the spacing of solvent energy states on the high-pressure (dilute-solvent) side of the membrane to match that of the pure solvent, restoring osmotic equilibrium.
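The magnitude of Π for a dilute solution is given by the familiar van 't Hoff equation, Π = MRT; the concentration below is an arbitrary illustrative value:

```python
R = 0.08206   # gas constant, L atm / (mol K)
M, T = 0.1, 298.0   # 0.1 mol/L of dissolved particles, at 25 C

Pi = M * R * T      # van 't Hoff equation for osmotic pressure
print(round(Pi, 2)) # ~2.45 atm
```

Even this modest concentration corresponds to a substantial hydrostatic pressure, which is why osmotic effects are so easy to observe.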
I wish to thank Frank Lambert for convincing me (and more importantly, many textbook authors) to stop corrupting youth by equating entropy with "disorder". See especially his introductory articles Entropy is Simple, and Teaching Entropy. I also found much inspiration (and ideas for many of my diagrams) in William G. Davies' 1972 paperback Introduction to thermodynamics: a non-calculus approach. Finally, I am grateful to Bob Hanson of St. Olaf College for his helpful comments.
All about entropy... not really; some ideas about order-from-disorder and cognition
Boltzmann's Dream - statistical physics and visualizing phase space by Franz Vesely (mathematical!)
Can gravity decrease entropy? What happens when gases in outer space condense into stars?
Heat engines - HyperPhysics site
Lord Kelvin can conserve you from entropy - praise the Lord!
Maxwellian Demon site - some articles on entropy and the Gibbs Paradox
The laws of Thermodynamics - a physicist's view by John Denker
Theological thermodynamics: heaven is hotter than hell. A thoughtful (and somewhat technical) article, and a response to it.
Thermodynamics, Evolution and Creationism - many interesting links relating to the misunderstanding of the Second Law from The Talk.Origins site.
Thermodynamics Primer - sections on entropy, thermal equilibrium, chemical potential
What is a microstate? One of many interesting articles at Frank Lambert's site.
Page last modified: 10.03.2010