Temperature & Entropy

<p style="font-size:18pt;"><b>Claimed by Ghood6 (Spring 2026)</b></p>

<blockquote>
<p>The increase of ... entropy is what distinguishes the past from the future, giving a direction to time.</p>
<p>— Stephen Hawking</p>
</blockquote>
Temperature and entropy are two of the most important concepts in thermal physics. '''Temperature''' is a measure of the average kinetic energy associated with the microscopic motion of the particles in a system, and it determines the direction of spontaneous energy transfer between two systems in thermal contact. '''Entropy''' is a quantitative measure of the number of microscopic arrangements (microstates) that correspond to a given macroscopic state of a system. The two are deeply linked: temperature is precisely the inverse of how rapidly the entropy of a system grows when energy is added to it.

What makes entropy so important is its connection to the [https://en.wikipedia.org/wiki/Second_law_of_thermodynamics Second Law of Thermodynamics], which states:

<blockquote>
<p>The total entropy of an isolated system never decreases over time.</p>
</blockquote>

Entropy is closely related to [[Application of Statistics in Physics]], since it is fundamentally a statistical quantity. The Second Law follows from the simple fact that, when many particles share energy, configurations with more microstates are vastly more probable than configurations with few. On macroscopic scales, the probability of spontaneous entropy decrease is so small that it is effectively never observed. The concept of entropy is also what rules out [https://en.wikipedia.org/wiki/Perpetual_motion perpetual motion machines] of the second kind.
==The Main Idea==

Temperature, energy, and entropy together describe the thermal behavior of any system in equilibrium. Imagine two solid blocks at different temperatures placed in contact. Energy will flow from the hotter block to the cooler one, not because energy "knows" which way to go, but because the combined system has many more ways to arrange its energy when it is shared between the blocks at a common temperature than when it is concentrated in one of them. Temperature emerges naturally as the quantity that equalizes when this process is complete.

The microscopic picture used most often in introductory physics is the '''Einstein model of a solid'''. Each atom in the solid is treated as three independent quantum harmonic oscillators (one for each spatial direction), each able to hold energy only in integer multiples of a quantum <math>\Delta E = \hbar \omega_0</math>. The macroscopic properties of the solid — its temperature, its heat capacity, and its entropy — are then computed by counting the number of ways the available energy quanta can be distributed among the oscillators.
===A Mathematical Model===

The fundamental thermodynamic definition of temperature is

<math>\frac{1}{T} = \frac{\partial S}{\partial E}</math>

where ''S'' is the entropy of the system and ''E'' is its internal energy. In words, temperature measures how much the entropy of a system increases per unit of energy added to it. A system that gains a lot of entropy from a small amount of added energy is "cold"; a system whose entropy barely changes when energy is added is "hot."

Boltzmann's statistical definition of entropy is

<math>S = k_B \ln \Omega</math>

where <math>\Omega</math> is the number of microstates consistent with the macrostate of the system, and <math>k_B = 1.38 \times 10^{-23} \text{ J/K}</math> is Boltzmann's constant.
For an Einstein solid containing ''N'' independent quantum oscillators sharing ''q'' indistinguishable energy quanta, the multiplicity is

<math>\Omega(N, q) = \frac{(q + N - 1)!}{q!\,(N - 1)!}</math>
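
These two formulas are easy to check numerically with exact integer combinatorics. A minimal sketch in Python (the function names here are our own, chosen for illustration):

<syntaxhighlight lang="python">
from math import comb, log

K_B = 1.38e-23  # Boltzmann's constant, J/K

def multiplicity(N, q):
    """Number of ways to distribute q quanta among N oscillators."""
    return comb(q + N - 1, q)  # equals (q+N-1)! / (q! (N-1)!)

def entropy(N, q):
    """Boltzmann entropy S = k_B ln(Omega), in J/K."""
    return K_B * log(multiplicity(N, q))

print(multiplicity(3, 4))  # 15 microstates, as in the Simple example below
print(entropy(3, 4))       # ~3.74e-23 J/K
</syntaxhighlight>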
For two Einstein solids in thermal contact with ''q''<sub>total</sub> quanta shared between them, the most probable macrostate is the one that maximizes <math>\Omega_1(N_1, q_1)\,\Omega_2(N_2, q_2)</math> subject to <math>q_1 + q_2 = q_{\text{total}}</math>. Setting the derivative with respect to ''q''<sub>1</sub> to zero (and using the fact that <math>dE_2 = -dE_1</math> when the total energy is fixed) gives

<math>\frac{\partial S_1}{\partial E_1} = \frac{\partial S_2}{\partial E_2} \quad\Rightarrow\quad T_1 = T_2</math>

which is exactly the condition of thermal equilibrium. This is the microscopic reason temperatures equalize.
The change in entropy for a reversible process is given by the Clausius definition,

<math>dS = \frac{dQ_{\text{rev}}}{T}</math>

and integrating between two states gives the macroscopic entropy change. For an ideal gas of ''n'' moles undergoing a reversible isothermal expansion from volume ''V''<sub>1</sub> to ''V''<sub>2</sub>, this becomes

<math>\Delta S = n R \ln\!\left(\frac{V_2}{V_1}\right)</math>
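
As a quick numerical illustration (the numbers here are our own example, not from the text above): doubling the volume of one mole of an ideal gas isothermally adds <math>R \ln 2 \approx 5.76</math> J/K of entropy.

<syntaxhighlight lang="python">
from math import log

R = 8.314  # gas constant, J/(mol K)

def delta_S_isothermal(n, V1, V2):
    """Entropy change of n moles of ideal gas expanding isothermally from V1 to V2."""
    return n * R * log(V2 / V1)

print(delta_S_isothermal(1.0, 1.0, 2.0))  # ~5.76 J/K
</syntaxhighlight>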
===A Computational Model===

A useful way to build intuition for entropy is to simulate two Einstein solids in thermal contact and watch the most probable energy distribution emerge from random exchanges of quanta. The PhET simulation below also illustrates the same principle for a gas:

* [https://phet.colorado.edu/sims/html/gas-properties/latest/gas-properties_en.html PhET Gas Properties Simulation] — drag particles around, change temperature and volume, and watch how the system relaxes toward the most probable (highest-multiplicity) state.

A custom GlowScript / VPython model of two coupled Einstein solids is described in the "Suggested Computational Model" section below; it should be embedded here once written.
==Examples==

===Simple===

'''Question:''' A small Einstein solid has ''N'' = 3 oscillators and ''q'' = 4 quanta of energy. How many microstates does the system have, and what is its entropy?

'''Solution:''' Using the multiplicity formula,

<math>\Omega = \frac{(q + N - 1)!}{q!\,(N - 1)!} = \frac{(4 + 3 - 1)!}{4!\,(3 - 1)!} = \frac{6!}{4!\,2!} = \frac{720}{24 \cdot 2} = 15</math>

The entropy is

<math>S = k_B \ln \Omega = (1.38 \times 10^{-23}\,\text{J/K})\,\ln(15) \approx 3.74 \times 10^{-23}\,\text{J/K}</math>

This is tiny because we only have three oscillators. Real solids have <math>\sim 10^{23}</math> oscillators, which is why measurable entropies are on the order of joules per kelvin.
===Middling===

'''Question:''' During a phase transition, a material's temperature does not increase even as energy is added — for example, while ice melts at 0 °C. Explain how this is consistent with the relation <math>1/T = \partial S / \partial E</math>.

'''Solution:''' During the transition, energy enters the system as latent heat, and the entropy increases because the liquid phase has many more accessible microstates than the solid phase (molecules are no longer locked into a lattice). Both ''S'' and ''E'' increase, but they increase together at a constant ratio set by the transition temperature, so

<math>\frac{\partial S}{\partial E} = \frac{1}{T_{\text{melt}}} = \text{constant}</math>

The temperature stays fixed at the melting point until the entire phase transition is complete. The latent heat of fusion for ice, for instance, gives

<math>L_f = 334 \text{ J/g}, \qquad \Delta S = \frac{L_f}{T} = \frac{334}{273.15} \approx 1.22 \text{ J/(g·K)}</math>
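
The same arithmetic scales linearly with mass; a small sketch (our own illustration):

<syntaxhighlight lang="python">
L_F = 334.0      # latent heat of fusion of water, J/g
T_MELT = 273.15  # melting temperature of ice, K

def melt_entropy(mass_g):
    """Entropy gained when mass_g grams of ice melt at 0 deg C, in J/K."""
    return mass_g * L_F / T_MELT

print(melt_entropy(1.0))   # ~1.22 J/K per gram, as computed above
print(melt_entropy(10.0))  # a 10 g ice cube gains ~12.2 J/K on melting
</syntaxhighlight>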
===Difficult===

'''Question:''' Two Einstein solids ''A'' and ''B'' are in thermal contact. Solid ''A'' has <math>N_A = 300</math> oscillators and solid ''B'' has <math>N_B = 200</math> oscillators. They share <math>q_{\text{total}} = 100</math> quanta. Find the most probable value of <math>q_A</math>, and show that at this point the two solids have the same temperature.

'''Solution:''' The total multiplicity is

<math>\Omega_{\text{total}}(q_A) = \Omega_A(N_A, q_A)\,\Omega_B(N_B, q_{\text{total}} - q_A)</math>

The most probable macrostate is the one that maximizes <math>\ln \Omega_{\text{total}}</math>. Taking the derivative with respect to <math>q_A</math> (with <math>q_B = q_{\text{total}} - q_A</math>) and setting it to zero,

<math>\frac{\partial \ln \Omega_A}{\partial q_A} = \frac{\partial \ln \Omega_B}{\partial q_B}</math>

For large ''N'' and ''q'' the Einstein solid satisfies <math>\partial \ln \Omega / \partial q \approx \ln\!\big((q + N)/q\big)</math>, a consequence of Stirling's approximation, so the equilibrium condition is

<math>\frac{q_A + N_A}{q_A} = \frac{q_B + N_B}{q_B} \quad\Longleftrightarrow\quad \frac{q_A}{N_A} = \frac{q_B}{N_B}</math>

Solving with <math>q_A + q_B = 100</math>, <math>N_A = 300</math>, <math>N_B = 200</math> gives

<math>q_A = q_{\text{total}} \cdot \frac{N_A}{N_A + N_B} = 100 \cdot \frac{300}{500} = 60, \qquad q_B = 40</math>

The energy per oscillator is the same in both solids (<math>q_A/N_A = q_B/N_B = 0.2</math>), and since temperature in the Einstein model is a monotonic function of energy per oscillator, the two temperatures are equal — exactly as expected for thermal equilibrium.
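
Because Python integers are exact and unbounded, the peak can also be verified by brute force; a minimal check of the result above:

<syntaxhighlight lang="python">
from math import comb

N_A, N_B, q_total = 300, 200, 100

def multiplicity(N, q):
    return comb(q + N - 1, q)

# Total multiplicity for every possible split of the quanta
omega = [multiplicity(N_A, qA) * multiplicity(N_B, q_total - qA)
         for qA in range(q_total + 1)]

q_star = max(range(q_total + 1), key=lambda qA: omega[qA])
print(q_star)  # 60, matching the analytic result
</syntaxhighlight>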
==Suggested Computational Model==

Below is a description of a computational model that should be implemented in [https://www.glowscript.org/ GlowScript / VPython] and embedded into this page using a Trinket. The intended behavior is to simulate two Einstein solids exchanging energy quanta and to plot the resulting energy distribution alongside the theoretical multiplicity curve.

'''Goal:''' Visually demonstrate that two Einstein solids in thermal contact spontaneously evolve toward the macrostate of maximum total multiplicity (i.e., thermal equilibrium), and that the equilibrium distribution matches the theoretical prediction <math>q_A/N_A = q_B/N_B</math>.

'''Inputs (constants the user can change at the top of the script):'''
* <code>N_A</code> — number of oscillators in solid A (e.g., 300)
* <code>N_B</code> — number of oscillators in solid B (e.g., 200)
* <code>q_total</code> — total energy quanta to distribute (e.g., 100)
* <code>n_steps</code> — number of Monte Carlo exchange steps (e.g., 50000)
* <code>initial_split</code> — fraction of quanta initially placed in solid A (e.g., 1.0, meaning all quanta start in A)
'''Algorithm''' (a minimal plain-Python sketch is given after this list):
# Create two arrays of length <code>N_A</code> and <code>N_B</code>, each entry representing the number of quanta on that oscillator. Distribute <code>q_total * initial_split</code> quanta randomly among the oscillators of solid A and the rest among solid B.
# At each Monte Carlo step:
## Pick a random oscillator in either solid that currently has at least one quantum.
## Pick a second random oscillator anywhere in either solid.
## Move one quantum from the first to the second. (This conserves <code>q_total</code> and respects the assumption that all microstates are equally likely.)
# Every <code>k</code> steps, record <code>q_A</code> (the total quanta currently in solid A) and update a live histogram of visited <code>q_A</code> values.
# In parallel, plot the theoretical curve <math>\Omega_A(N_A, q_A) \Omega_B(N_B, q_{\text{total}} - q_A)</math> normalized to the same area as the histogram.
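
A minimal text-mode sketch of this loop in plain Python (no VPython visuals; the variable names and structure are ours, for illustration only):

<syntaxhighlight lang="python">
import random
from collections import Counter

N_A, N_B = 300, 200
q_total = 100
n_steps = 50_000

# initial_split = 1.0: all quanta start on random oscillators of solid A.
quanta = [0] * (N_A + N_B)  # oscillators 0..N_A-1 belong to solid A
for _ in range(q_total):
    quanta[random.randrange(N_A)] += 1

q_A = q_total        # running count of quanta currently in solid A
visits = Counter()   # histogram of visited q_A values

for step in range(n_steps):
    # Pick a donor oscillator that has at least one quantum (rejection sampling).
    donor = random.randrange(N_A + N_B)
    while quanta[donor] == 0:
        donor = random.randrange(N_A + N_B)
    # Pick any oscillator as the receiver and move one quantum across.
    receiver = random.randrange(N_A + N_B)
    quanta[donor] -= 1
    quanta[receiver] += 1
    q_A += (receiver < N_A) - (donor < N_A)
    # Record every 10th step after a short burn-in while the system relaxes.
    if step > 5_000 and step % 10 == 0:
        visits[q_A] += 1

# The most-visited macrostate should sit near q_total*N_A/(N_A+N_B) = 60.
print(max(visits, key=visits.get))
</syntaxhighlight>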
'''Visualization:'''
* Two side-by-side rectangular blocks rendered with VPython <code>box</code> objects, each colored on a heat-map scale according to its current energy per oscillator (blue = cold, red = hot).
* A live time series at the top showing <code>q_A(t)</code> approaching the equilibrium value <math>q_A^* = q_{\text{total}} N_A / (N_A + N_B)</math>.
* A live histogram at the bottom of visited <code>q_A</code> values, with the theoretical multiplicity curve overlaid as a smooth line.

'''Expected result:''' Even when all quanta start in solid A, within a few thousand steps the system relaxes to a sharply peaked distribution centered on <math>q_A^* = q_{\text{total}} N_A / (N_A + N_B)</math>, and the histogram matches the theoretical multiplicity curve. This is a direct visualization of the Second Law: the system finds the macrostate of maximum multiplicity.

'''Embedding:''' Once the model is written on glowscript.org and converted to a Trinket, embed it on this page using the Trinket iframe snippet, e.g.

<code><iframe src="https://trinket.io/embed/glowscript/XXXXXXXX" width="100%" height="600" frameborder="0" marginwidth="0" marginheight="0" allowfullscreen></iframe></code>
==Connectedness==

'''How is this topic connected to something I am interested in?'''

Entropy connects directly to information theory and machine learning. Claude Shannon's information entropy <math>H = -\sum p_i \log p_i</math> is mathematically identical (up to a constant) to Boltzmann's thermodynamic entropy. Modern machine learning models are routinely trained by minimizing cross-entropy, and the same statistical reasoning that explains why heat flows from hot to cold also explains why a well-trained model converges to the most probable explanation of the data.
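
For instance, a fair coin carries one bit of Shannon entropy and a certain outcome carries none; a two-function sketch (our own illustration):

<syntaxhighlight lang="python">
from math import log2

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i log2 p_i) of a probability list, in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
print(shannon_entropy([1.0]))       # 0.0 bits: a certain outcome
</syntaxhighlight>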
'''How is it connected to my major?'''

For engineering majors, the second law sets a hard upper limit on the efficiency of every heat engine and every refrigerator: the [https://en.wikipedia.org/wiki/Carnot_cycle Carnot efficiency] <math>\eta = 1 - T_C/T_H</math>. No amount of cleverness in design can beat this bound. For computer scientists, Landauer's principle states that erasing a single bit of information in an environment at temperature ''T'' must dissipate at least <math>k_B T \ln 2</math> of heat — a thermodynamic cost on computation itself.
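
Both bounds are one-liners to evaluate; for example (the temperatures here are our own illustrative choices), an engine running between 300 K and 800 K can convert at most 62.5% of the absorbed heat into work:

<syntaxhighlight lang="python">
from math import log

K_B = 1.38e-23  # Boltzmann's constant, J/K

def carnot_efficiency(T_C, T_H):
    """Maximum efficiency of a heat engine between cold/hot reservoirs (in K)."""
    return 1 - T_C / T_H

def landauer_limit(T):
    """Minimum heat dissipated to erase one bit at temperature T, in joules."""
    return K_B * T * log(2)

print(carnot_efficiency(300, 800))  # 0.625
print(landauer_limit(300))          # ~2.87e-21 J
</syntaxhighlight>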
'''Industrial applications:'''

Entropy and temperature are central to power generation (steam turbines, gas turbines, nuclear reactors), refrigeration and air conditioning, materials processing, chemical engineering (free energy and reaction spontaneity), and the design of low-power electronics where thermal management is the dominant constraint.
==History==

The concepts of temperature, entropy, and energy have been intertwined throughout the history of physics. Early notions of heat described it as a fluid (the "caloric"), and even Sir Isaac Newton speculated that heat might have mass. In 1824, the French engineer '''Sadi Carnot''' published ''Reflections on the Motive Power of Fire'', which established the upper limit on the efficiency of any heat engine — the first quantitative statement of the second law, written before entropy itself had been named.

In the 1850s, '''Rudolf Clausius''' formalized the idea that thermodynamic processes irreversibly degrade energy. In 1865 he coined the word '''entropy''' from the Greek τροπή ("transformation"), explaining:

<blockquote>
<p>I prefer going to the ancient languages for the names of important scientific quantities, so that they may mean the same thing in all living tongues. I propose, therefore, to call ''S'' the entropy of a body, after the Greek word "transformation". I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance that an analogy of denominations seems to me helpful.</p>
</blockquote>

In the 1870s and 1880s, '''Ludwig Boltzmann''' established the microscopic foundation of entropy by connecting it to the number of accessible microstates, <math>S = k_B \ln \Omega</math> — the equation engraved on his tombstone in Vienna. '''Josiah Willard Gibbs''' then extended the framework into the modern statistical mechanics used today, treating entropy as a property of probability distributions over microstates.

The 20th century added two more layers. In 1948, '''Claude Shannon''' showed that the same mathematical form governs information. In the 1970s, '''Jacob Bekenstein''' and '''Stephen Hawking''' showed that black holes themselves carry entropy proportional to their horizon area, tying thermodynamics to gravitation and quantum mechanics.
==Notable Scientists==
* Sadi Carnot
* Rudolf Clausius
* James Prescott Joule
* Ludwig Boltzmann
* Josiah Willard Gibbs
* Claude Shannon
==See also==

All thermodynamics is linked to the concept of [[Energy]]. See also [[Heat]], [[Temperature]], [[Statistical Mechanics]], [[Spontaneity]]. For an interesting cosmological perspective, see the [https://en.wikipedia.org/wiki/Heat_death_of_the_universe Heat Death of the Universe].

===Further reading===
* [[Application of Statistics in Physics]]
* [[Quantized energy levels]]
* [[Electronic Energy Levels and Photons]]
* Schroeder, D. V. ''An Introduction to Thermal Physics'' (Addison-Wesley, 2000) — chapters 2 and 3 give the Einstein-solid derivation used above.
* Reif, F. ''Fundamentals of Statistical and Thermal Physics'' (McGraw-Hill, 1965).
===External links===
* [https://en.wikipedia.org/wiki/Entropy Wikipedia: Entropy]
* [https://en.wikipedia.org/wiki/Temperature Wikipedia: Temperature]
* [https://en.wikipedia.org/wiki/Second_law_of_thermodynamics Wikipedia: Second Law of Thermodynamics]
* [https://en.wikipedia.org/wiki/Einstein_solid Wikipedia: Einstein Solid]
* [https://phet.colorado.edu/sims/html/gas-properties/latest/gas-properties_en.html PhET: Gas Properties]
* [https://www.feynmanlectures.caltech.edu/I_44.html Feynman Lectures, Vol. I, Ch. 44 — The Laws of Thermodynamics]
==References==
# "Entropy." ''Wikipedia'', Wikimedia Foundation, https://en.wikipedia.org/wiki/Entropy
# "Temperature." ''Wikipedia'', Wikimedia Foundation, https://en.wikipedia.org/wiki/Temperature
# "Einstein solid." ''Wikipedia'', Wikimedia Foundation, https://en.wikipedia.org/wiki/Einstein_solid
# Schroeder, Daniel V. ''An Introduction to Thermal Physics''. Addison-Wesley, 2000.
# Carnot, Sadi. ''Reflections on the Motive Power of Fire''. 1824. Public-domain English translation: https://en.wikisource.org/wiki/Reflections_on_the_Motive_Power_of_Heat
# Bekenstein, J. D. "Black Holes and Entropy." ''Physical Review D'' 7, 2333 (1973).

[[Category:Statistical Physics]]