Temperature & Entropy

From Physics Book
<p style="font-size:18pt;"><b>Claimed by Ghood6 (Spring 2026)</b></p>
<p>Previously claimed by Josh Brandt (April 19th, Spring 2022)</p>


<blockquote>
  <p>The increase of ... entropy is what distinguishes the past from the future, giving a direction to time.</p>
  <p>— Stephen Hawking</p>
</blockquote>


Temperature and entropy are two of the most important concepts in thermal physics. '''Temperature''' is a measure of the average kinetic energy associated with the microscopic motion of the particles in a system, and it determines the direction of spontaneous energy transfer between two systems in thermal contact. '''Entropy''' is a quantitative measure of the number of microscopic arrangements (microstates) that correspond to a given macroscopic state of a system. The two are deeply linked: temperature is precisely the inverse of how rapidly the entropy of a system grows when energy is added to it.
 
What makes entropy so important is its connection to the [https://en.wikipedia.org/wiki/Second_law_of_thermodynamics Second Law of Thermodynamics], which states:


<blockquote>
  <p>The total entropy of an isolated system never decreases over time.</p>
</blockquote>


Entropy is highly related to [[Application of Statistics in Physics]], since it is fundamentally a statistical quantity. The Second Law follows from the simple fact that, when many particles share energy, configurations with more microstates are vastly more probable than configurations with few. On macroscopic scales, the probability of a spontaneous entropy decrease is so small that it is effectively never observed. The concept of entropy famously rules out [https://en.wikipedia.org/wiki/Perpetual_motion perpetual motion machines] of the second kind.
 
==The Main Idea==
 
Temperature, energy, and entropy together describe the thermal behavior of any system in equilibrium. Imagine two solid blocks at different temperatures placed in contact. Energy will flow from the hotter block to the cooler one, not because energy "knows" which way to go, but because the combined system has many more ways to arrange its energy when it is shared between the blocks at a common temperature than when it is concentrated in one of them. Temperature emerges naturally as the quantity that equalizes when this process is complete.


The microscopic picture used most often in introductory physics is the '''Einstein model of a solid'''. Each atom in the solid is treated as three independent quantum harmonic oscillators (one for each spatial direction), each able to hold energy only in discrete quanta of size <math>q = \hbar \omega_0</math>. The macroscopic properties of the solid — its temperature, its heat capacity, and its entropy — are then computed by counting the number of ways the available energy quanta can be distributed among the oscillators.


===A Mathematical Model===


The fundamental thermodynamic definition of temperature is


<math>\frac{1}{T} = \frac{\partial S}{\partial E}</math>


where ''S'' is the entropy of the system and ''E'' is its internal energy. In words, temperature measures how much the entropy of a system increases per unit of energy added to it. A system that gains a lot of entropy from a small amount of added energy is "cold"; a system whose entropy barely changes when energy is added is "hot."


Boltzmann's statistical definition of entropy is


<math>S = k_B \ln \Omega</math>


where <math>\Omega</math> is the number of microstates consistent with the macrostate of the system, and <math>k_B = 1.38 \times 10^{-23} \text{ J/K}</math> is Boltzmann's constant.


For an Einstein solid containing ''N'' independent quantum oscillators sharing ''q'' indistinguishable energy quanta, the multiplicity is


<math>\Omega(N, q) = \frac{(q + N - 1)!}{q!\,(N - 1)!}</math>
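For small systems this multiplicity is easy to evaluate directly. A minimal Python sketch (the function names here are my own, purely for illustration):

```python
from math import comb, log

K_B = 1.38e-23  # Boltzmann's constant, J/K

def multiplicity(N, q):
    """Number of ways to distribute q indistinguishable quanta among N oscillators."""
    return comb(q + N - 1, q)

def entropy(N, q):
    """Boltzmann entropy S = k_B * ln(Omega), in J/K."""
    return K_B * log(multiplicity(N, q))

# Three oscillators sharing two quanta can be arranged in 6 distinct ways
print(multiplicity(3, 2))  # → 6
```

`math.comb` computes the binomial coefficient exactly with integer arithmetic, so this stays correct even when <math>\Omega</math> is astronomically large.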


For two Einstein solids in thermal contact with ''q''<sub>total</sub> quanta shared between them, the most probable macrostate is the one that maximizes <math>\Omega_1(N_1, q_1)\,\Omega_2(N_2, q_2)</math> subject to <math>q_1 + q_2 = q_{\text{total}}</math>. Setting the derivative with respect to ''q''<sub>1</sub> to zero gives


<math>\frac{\partial S_1}{\partial E_1} = \frac{\partial S_2}{\partial E_2} \quad\Rightarrow\quad T_1 = T_2</math>


which is exactly the condition of thermal equilibrium. This is the microscopic reason temperatures equalize.


The change in entropy for a reversible process is given by the Clausius definition,


<math>dS = \frac{dQ_{\text{rev}}}{T}</math>


and integrating between two states gives the macroscopic entropy change. For an ideal gas undergoing a reversible isothermal expansion from volume ''V''<sub>1</sub> to ''V''<sub>2</sub>, this becomes


<math>\Delta S = n R \ln\!\left(\frac{V_2}{V_1}\right)</math>
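As a concrete check, one mole of ideal gas doubling its volume isothermally gains <math>R \ln 2 \approx 5.76</math> J/K. A quick sketch (the helper name is mine):

```python
from math import log

R = 8.314  # ideal gas constant, J/(mol K)

def delta_S_isothermal(n, V1, V2):
    """Entropy change for n moles of ideal gas in a reversible isothermal expansion V1 -> V2."""
    return n * R * log(V2 / V1)

# One mole doubling its volume
print(round(delta_S_isothermal(1.0, 1.0, 2.0), 2))  # → 5.76 J/K
```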


===A Computational Model===


A useful way to build intuition for entropy is to simulate two Einstein solids in thermal contact and watch the most probable energy distribution emerge from random exchanges of quanta. The PhET simulation below also illustrates the same principle for a gas:


* [https://phet.colorado.edu/sims/html/gas-properties/latest/gas-properties_en.html PhET Gas Properties Simulation] — drag particles around, change temperature and volume, and watch how the system relaxes toward the most probable (highest-multiplicity) state.


A custom GlowScript / VPython model of two coupled Einstein solids is described in the "Suggested Computational Model" section below; it should be embedded here once written.


==Examples==


===Simple===


'''Question:''' A small Einstein solid has ''N'' = 3 oscillators and ''q'' = 4 quanta of energy. How many microstates does the system have, and what is its entropy?


'''Solution:'''
Using the multiplicity formula,


<math>\Omega = \frac{(q + N - 1)!}{q!\,(N - 1)!} = \frac{(4 + 3 - 1)!}{4!\,(3 - 1)!} = \frac{6!}{4!\,2!} = \frac{720}{24 \cdot 2} = 15</math>


The entropy is


<math>S = k_B \ln \Omega = (1.38 \times 10^{-23}\,\text{J/K})\,\ln(15) \approx 3.74 \times 10^{-23}\,\text{J/K}</math>


This is tiny because we only have three oscillators. Real solids have <math>\sim 10^{23}</math> oscillators, which is why measurable entropies are on the order of joules per kelvin.


===Middling===


'''Question:''' During a phase transition, a material's temperature does not increase even as energy is added — for example, while ice melts at 0 °C. Explain how this is consistent with the relation <math>1/T = \partial S / \partial E</math>.


'''Solution:'''
During the transition, energy enters the system as latent heat, and the entropy increases because the liquid phase has many more accessible microstates than the solid phase (molecules are no longer locked into a lattice). Both ''S'' and ''E'' increase, but they increase together at a constant ratio set by the transition temperature, so


<math>\frac{\partial S}{\partial E} = \frac{1}{T_{\text{melt}}} = \text{constant}</math>


The temperature stays fixed at the melting point until the entire phase transition is complete. The latent heat for melting ice, for instance, is


<math>L_f = 334 \text{ J/g}, \qquad \Delta S = \frac{L_f}{T} = \frac{334}{273.15} \approx 1.22 \text{ J/(g·K)}</math>


===Difficult===


'''Question:''' Two Einstein solids ''A'' and ''B'' are in thermal contact. Solid ''A'' has <math>N_A = 300</math> oscillators and solid ''B'' has <math>N_B = 200</math> oscillators. They share <math>q_{\text{total}} = 100</math> quanta. Find the most probable value of <math>q_A</math>, and show that at this point the two solids have the same temperature.


'''Solution:'''
The total multiplicity is


<math>\Omega_{\text{total}}(q_A) = \Omega_A(N_A, q_A)\,\Omega_B(N_B, q_{\text{total}} - q_A)</math>


The most probable macrostate is the one that maximizes <math>\ln \Omega_{\text{total}}</math>. Taking the derivative with respect to <math>q_A</math> and using Stirling's approximation,


<math>\frac{\partial \ln \Omega_A}{\partial q_A} = \frac{\partial \ln \Omega_B}{\partial q_B}</math>


For large ''N'' and ''q'' the Einstein solid satisfies <math>\partial \ln \Omega / \partial q \approx \ln\!\big((q + N)/q\big)</math>, so the equilibrium condition is


<math>\frac{q_A + N_A}{q_A} = \frac{q_B + N_B}{q_B}</math>


Solving with <math>q_A + q_B = 100</math>, <math>N_A = 300</math>, <math>N_B = 200</math> gives


<math>q_A = q_{\text{total}} \cdot \frac{N_A}{N_A + N_B} = 100 \cdot \frac{300}{500} = 60, \qquad q_B = 40</math>


The energy per oscillator is the same in both solids (<math>q_A/N_A = q_B/N_B = 0.2</math>), and since temperature in the Einstein model is a monotonic function of energy per oscillator, the two temperatures are equal — exactly as expected for thermal equilibrium.
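This result can also be verified by brute force: evaluate the total multiplicity for every allowed split of the quanta and locate the peak. A short Python sketch (variable names are mine):

```python
from math import comb

def multiplicity(N, q):
    # Omega(N, q) = (q + N - 1)! / (q! (N - 1)!)
    return comb(q + N - 1, q)

N_A, N_B, q_total = 300, 200, 100

# Total multiplicity of the combined system for each possible split q_A / q_B
omega_total = [multiplicity(N_A, qA) * multiplicity(N_B, q_total - qA)
               for qA in range(q_total + 1)]

q_A_star = max(range(q_total + 1), key=omega_total.__getitem__)
print(q_A_star)  # → 60, matching q_total * N_A / (N_A + N_B)
```

Because Python integers are arbitrary precision and `comb` is exact, no approximation is involved; the discrete maximum lands exactly at the value the Stirling analysis predicts.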


==Suggested Computational Model==


Below is a description of a computational model that should be implemented in [https://www.glowscript.org/ GlowScript / VPython] and embedded into this page using a Trinket. The intended behavior is to simulate two Einstein solids exchanging energy quanta and to plot the resulting energy distribution alongside the theoretical multiplicity curve.


'''Goal:''' Visually demonstrate that two Einstein solids in thermal contact spontaneously evolve toward the macrostate of maximum total multiplicity (i.e., thermal equilibrium), and that the equilibrium distribution matches the theoretical prediction <math>q_A/N_A = q_B/N_B</math>.


'''Inputs (constants the user can change at the top of the script):'''
* <code>N_A</code> — number of oscillators in solid A (e.g., 300)
* <code>N_B</code> — number of oscillators in solid B (e.g., 200)
* <code>q_total</code> — total energy quanta to distribute (e.g., 100)
* <code>n_steps</code> — number of Monte Carlo exchange steps (e.g., 50000)
* <code>initial_split</code> — fraction of quanta initially placed in solid A (e.g., 1.0, meaning all quanta start in A)


'''Algorithm:'''
# Create two arrays of length <code>N_A</code> and <code>N_B</code>, each entry representing the number of quanta on that oscillator. Distribute <code>q_total * initial_split</code> quanta randomly among the oscillators of solid A and the rest among solid B.
# At each Monte Carlo step:
## Pick a random oscillator in either solid that currently has at least one quantum.
## Pick a second random oscillator anywhere in either solid.
## Move one quantum from the first to the second. (This conserves <code>q_total</code> and respects the assumption that all microstates are equally likely.)
# Every <code>k</code> steps, record <code>q_A</code> (the total quanta currently in solid A) and update a live histogram of visited <code>q_A</code> values.
# In parallel, plot the theoretical curve <math>\Omega_A(N_A, q_A) \Omega_B(N_B, q_{\text{total}} - q_A)</math> normalized to the same area as the histogram.
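Before committing to GlowScript, the exchange loop itself can be prototyped in plain Python without graphics. This is a sketch under the assumptions above; if a randomly chosen donor oscillator happens to be empty, the step is simply skipped, which is one simple way to respect the "at least one quantum" rule:

```python
import random

random.seed(1)  # reproducible run
N_A, N_B, q_total, n_steps = 300, 200, 100, 50_000

# quanta[i] = number of quanta on oscillator i; solid A is the first N_A entries
quanta = [0] * (N_A + N_B)
for _ in range(q_total):                 # all quanta start in solid A
    quanta[random.randrange(N_A)] += 1

for _ in range(n_steps):
    donor = random.randrange(N_A + N_B)
    if quanta[donor] == 0:               # donor must hold at least one quantum
        continue
    receiver = random.randrange(N_A + N_B)
    quanta[donor] -= 1                   # move one quantum; q_total is conserved
    quanta[receiver] += 1

q_A = sum(quanta[:N_A])
print(q_A)  # fluctuates around q_total * N_A / (N_A + N_B) = 60
```

Even starting with every quantum in solid A, the recorded `q_A` relaxes to the neighborhood of 60 and stays there, with fluctuations of a few quanta, exactly as the multiplicity analysis predicts.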


'''Visualization:'''
* Two side-by-side rectangular blocks rendered with VPython <code>box</code> objects, each colored on a heat-map scale according to its current energy per oscillator (blue = cold, red = hot).
* A live time series at the top showing <code>q_A(t)</code> approaching the equilibrium value <math>q_A^* = q_{\text{total}} N_A / (N_A + N_B)</math>.
* A live histogram at the bottom of visited <code>q_A</code> values, with the theoretical multiplicity curve overlaid as a smooth line.


'''Expected result:''' Even when all quanta start in solid A, within a few thousand steps the system relaxes to a sharply peaked distribution centered on <math>q_A^* = q_{\text{total}} N_A / (N_A + N_B)</math>, and the histogram matches the theoretical multiplicity curve. This is a direct visualization of the Second Law: the system finds the macrostate of maximum multiplicity.


'''Embedding:''' Once the model is written on glowscript.org and converted to a Trinket, embed it on this page using the Trinket iframe snippet, e.g.


<code>&lt;iframe src="https://trinket.io/embed/glowscript/XXXXXXXX" width="100%" height="600" frameborder="0" marginwidth="0" marginheight="0" allowfullscreen&gt;&lt;/iframe&gt;</code>


==Connectedness==


'''How is this topic connected to something I am interested in?'''
Entropy connects directly to information theory and machine learning. Claude Shannon's information entropy <math>H = -\sum p_i \log p_i</math> is mathematically identical (up to a constant) to Boltzmann's thermodynamic entropy. Modern machine learning models are routinely trained by minimizing cross-entropy, and the same statistical reasoning that explains why heat flows from hot to cold also explains why a well-trained model converges to the most probable explanation of the data.


'''How is it connected to my major?'''
For engineering majors, the second law sets a hard upper limit on the efficiency of every heat engine and every refrigerator: the [https://en.wikipedia.org/wiki/Carnot_cycle Carnot efficiency] <math>\eta = 1 - T_C/T_H</math>. No amount of cleverness in design can beat this bound. For computer scientists, Landauer's principle states that erasing a single bit of information in an environment at temperature ''T'' must dissipate at least <math>k_B T \ln 2</math> of heat — a thermodynamic cost on computation itself.


'''Industrial applications:'''
Entropy and temperature are central to power generation (steam turbines, gas turbines, nuclear reactors), refrigeration and air conditioning, materials processing, chemical engineering (free energy and reaction spontaneity), and the design of low-power electronics where thermal management is the dominant constraint.


==History==


The concepts of temperature, entropy, and energy have been intertwined throughout the history of physics. Early notions of heat described it as a fluid (the "caloric"), and even Sir Isaac Newton speculated that heat might have mass. In 1824, the French engineer '''Sadi Carnot''' published ''Reflections on the Motive Power of Fire'', which established the upper limit on the efficiency of any heat engine — the first quantitative statement of the second law, written before entropy itself had been named.


In the 1850s, '''Rudolf Clausius''' formalized the idea that thermodynamic processes irreversibly degrade energy. In 1865 he coined the word '''entropy''' from the Greek τροπή ("transformation"), explaining:


<blockquote>
<p>I prefer going to the ancient languages for the names of important scientific quantities, so that they may mean the same thing in all living tongues. I propose, therefore, to call ''S'' the entropy of a body, after the Greek word "transformation". I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance that an analogy of denominations seems to me helpful.</p>
</blockquote>
 
In the 1870s and 1880s, '''Ludwig Boltzmann''' established the microscopic foundation of entropy by connecting it to the number of accessible microstates, <math>S = k_B \ln \Omega</math> — the equation engraved on his tombstone in Vienna. '''Josiah Willard Gibbs''' then extended the framework into the modern statistical mechanics used today, treating entropy as a property of probability distributions over microstates.


The 20th century added two more layers. In 1948, '''Claude Shannon''' showed that the same mathematical form governs information. In the 1970s, '''Jacob Bekenstein''' and '''Stephen Hawking''' showed that black holes themselves carry entropy proportional to their horizon area, tying thermodynamics to gravitation and quantum mechanics.


==Notable Scientists==
* [[Nicolas Leonard Sadi Carnot]]
* [[Rudolf Clausius]]
* [[James Prescott Joule]]
* [[Ludwig Boltzmann]]
* [[Josiah Willard Gibbs]]
* Claude Shannon
 
==See also==


All thermodynamics is linked to the concept of [[Energy]]. See also [[Heat]], [[Temperature]], [[Statistical Mechanics]], [[Spontaneity]]. For an interesting cosmological perspective, see the [https://en.wikipedia.org/wiki/Heat_death_of_the_universe Heat Death of the Universe].


===Further reading===


* [[Application of Statistics in Physics]]
* [[Quantized energy levels]]
* [[Electronic Energy Levels and Photons]]
* Schroeder, D. V. ''An Introduction to Thermal Physics'' (Addison-Wesley, 2000) — chapters 2 and 3 give the Einstein-solid derivation used above.
* Reif, F. ''Fundamentals of Statistical and Thermal Physics'' (McGraw-Hill, 1965).


===External links===


* [https://en.wikipedia.org/wiki/Entropy Wikipedia: Entropy]
* [https://en.wikipedia.org/wiki/Temperature Wikipedia: Temperature]
* [https://en.wikipedia.org/wiki/Second_law_of_thermodynamics Wikipedia: Second Law of Thermodynamics]
* [https://en.wikipedia.org/wiki/Einstein_solid Wikipedia: Einstein Solid]
* [https://phet.colorado.edu/sims/html/gas-properties/latest/gas-properties_en.html PhET: Gas Properties]
* [https://www.feynmanlectures.caltech.edu/I_44.html Feynman Lectures, Vol. I, Ch. 44 — The Laws of Thermodynamics]


==References==


# "Entropy." ''Wikipedia'', Wikimedia Foundation, https://en.wikipedia.org/wiki/Entropy
# "Temperature." ''Wikipedia'', Wikimedia Foundation, https://en.wikipedia.org/wiki/Temperature
# "Einstein solid." ''Wikipedia'', Wikimedia Foundation, https://en.wikipedia.org/wiki/Einstein_solid
# Schroeder, Daniel V. ''An Introduction to Thermal Physics''. Addison-Wesley, 2000.
# Carnot, Sadi. ''Reflections on the Motive Power of Fire''. 1824. Public-domain English translation: https://en.wikisource.org/wiki/Reflections_on_the_Motive_Power_of_Heat
# Bekenstein, J. D. "Black Holes and Entropy." ''Physical Review D'' 7, 2333 (1973).


[[Category:Statistical Physics]]

Revision as of 19:48, 26 April 2026

Claimed by Ghood6 (Spring 2026)

The increase of ... entropy is what distinguishes the past from the future, giving a direction to time.

— Stephen Hawking

Temperature and entropy are two of the most important concepts in thermal physics. Temperature is a measure of the average kinetic energy associated with the microscopic motion of the particles in a system, and it determines the direction of spontaneous energy transfer between two systems in thermal contact. Entropy is a quantitative measure of the number of microscopic arrangements (microstates) that correspond to a given macroscopic state of a system. The two are deeply linked: temperature is precisely the inverse of how rapidly the entropy of a system grows when energy is added to it.

What makes entropy so important is its connection to the Second Law of Thermodynamics, which states:

The total entropy of an isolated system never decreases over time.

Entropy is highly related to Application of Statistics in Physics, since it is fundamentally a statistical quantity. The Second Law follows from the simple fact that, when many particles share energy, configurations with more microstates are vastly more probable than configurations with few. On macroscopic scales, the probability of spontaneous entropy decrease is so small that it is effectively never observed. The concept of entropy is famous for its proof against the existence of perpetual motion machines of the second kind.

The Main Idea

Temperature, energy, and entropy together describe the thermal behavior of any system in equilibrium. Imagine two solid blocks at different temperatures placed in contact. Energy will flow from the hotter block to the cooler one, not because energy "knows" which way to go, but because the combined system has many more ways to arrange its energy when it is shared between the blocks at a common temperature than when it is concentrated in one of them. Temperature emerges naturally as the quantity that equalizes when this process is complete.

The microscopic picture used most often in introductory physics is the Einstein model of a solid. Each atom in the solid is treated as three independent quantum harmonic oscillators (one for each spatial direction), each able to hold energy only in discrete quanta of size [math]\displaystyle{ q = \hbar \omega_0 }[/math]. The macroscopic properties of the solid — its temperature, its heat capacity, and its entropy — are then computed by counting the number of ways the available energy quanta can be distributed among the oscillators.

A Mathematical Model

The fundamental thermodynamic definition of temperature is

[math]\displaystyle{ \frac{1}{T} = \frac{\partial S}{\partial E} }[/math]

where S is the entropy of the system and E is its internal energy. In words, temperature measures how much the entropy of a system increases per unit of energy added to it. A system that gains a lot of entropy from a small amount of added energy is "cold"; a system whose entropy barely changes when energy is added is "hot."

Boltzmann's statistical definition of entropy is

[math]\displaystyle{ S = k_B \ln \Omega }[/math]

where [math]\displaystyle{ \Omega }[/math] is the number of microstates consistent with the macrostate of the system, and [math]\displaystyle{ k_B = 1.38 \times 10^{-23} \text{ J/K} }[/math] is Boltzmann's constant.

For an Einstein solid containing N independent quantum oscillators sharing q indistinguishable energy quanta, the multiplicity is

[math]\displaystyle{ \Omega(N, q) = \frac{(q + N - 1)!}{q!\,(N - 1)!} }[/math]

For two Einstein solids in thermal contact with [math]\displaystyle{ q_{\text{total}} }[/math] quanta shared between them, the most probable macrostate is the one that maximizes [math]\displaystyle{ \Omega_1(N_1, q_1)\,\Omega_2(N_2, q_2) }[/math] subject to [math]\displaystyle{ q_1 + q_2 = q_{\text{total}} }[/math]. Setting the derivative with respect to [math]\displaystyle{ q_1 }[/math] to zero gives

[math]\displaystyle{ \frac{\partial S_1}{\partial E_1} = \frac{\partial S_2}{\partial E_2} \quad\Rightarrow\quad T_1 = T_2 }[/math]

which is exactly the condition of thermal equilibrium. This is the microscopic reason temperatures equalize.
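These definitions are easy to check numerically. The sketch below is plain Python (not the page's VPython model, and the function names and the quantum size `epsilon` are illustrative choices, not values from the text): it computes the multiplicity and entropy of a small Einstein solid and estimates its temperature from the finite difference [math]\displaystyle{ \Delta S / \Delta E }[/math].

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(N, q):
    """Omega(N, q): number of ways to distribute q quanta among N oscillators."""
    return comb(q + N - 1, q)

def entropy(N, q):
    """Boltzmann entropy S = k_B ln(Omega), in J/K."""
    return K_B * log(multiplicity(N, q))

def temperature(N, q, epsilon=1e-21):
    """Estimate T from 1/T = dS/dE by adding one quantum of (illustrative) size epsilon."""
    dS = entropy(N, q + 1) - entropy(N, q)
    return epsilon / dS

print(multiplicity(3, 4))     # 15 microstates (the Simple example below)
print(entropy(3, 4))          # ~3.74e-23 J/K
print(temperature(300, 100))  # ~53 K for this choice of epsilon
```

Note how adding one quantum changes the entropy by [math]\displaystyle{ k_B \ln\big((q+N)/(q+1)\big) }[/math], so the temperature estimate follows directly from the multiplicity ratio.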

The change in entropy for a reversible process is given by the Clausius definition,

[math]\displaystyle{ dS = \frac{dQ_{\text{rev}}}{T} }[/math]

and integrating between two states gives the macroscopic entropy change. For an ideal gas undergoing a reversible isothermal expansion from volume [math]\displaystyle{ V_1 }[/math] to [math]\displaystyle{ V_2 }[/math], this becomes

[math]\displaystyle{ \Delta S = n R \ln\!\left(\frac{V_2}{V_1}\right) }[/math]
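As a quick numerical check of this formula (the function name and the example values — one mole of gas doubling its volume — are illustrative):

```python
from math import log

R = 8.314  # ideal gas constant, J/(mol·K)

def delta_S_isothermal(n, V1, V2):
    """Entropy change for n moles of ideal gas in a reversible isothermal expansion."""
    return n * R * log(V2 / V1)

print(delta_S_isothermal(1.0, 1.0, 2.0))  # ~5.76 J/K for doubling the volume
```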

A Computational Model

A useful way to build intuition for entropy is to simulate two Einstein solids in thermal contact and watch the most probable energy distribution emerge from random exchanges of quanta. The PhET simulation below also illustrates the same principle for a gas:

  • PhET Gas Properties Simulation — drag particles around, change temperature and volume, and watch how the system relaxes toward the most probable (highest-multiplicity) state.

A custom GlowScript / VPython model of two coupled Einstein solids is described in the "Suggested Computational Model" section below; it should be embedded here once written.

Examples

Simple

Question: A small Einstein solid has N = 3 oscillators and q = 4 quanta of energy. How many microstates does the system have, and what is its entropy?

Solution: Using the multiplicity formula,

[math]\displaystyle{ \Omega = \frac{(q + N - 1)!}{q!\,(N - 1)!} = \frac{(4 + 3 - 1)!}{4!\,(3 - 1)!} = \frac{6!}{4!\,2!} = \frac{720}{24 \cdot 2} = 15 }[/math]

The entropy is

[math]\displaystyle{ S = k_B \ln \Omega = (1.38 \times 10^{-23}\,\text{J/K})\,\ln(15) \approx 3.74 \times 10^{-23}\,\text{J/K} }[/math]

This is tiny because we only have three oscillators. Real solids have [math]\displaystyle{ \sim 10^{23} }[/math] oscillators, which is why measurable entropies are on the order of joules per kelvin.

Middling

Question: During a phase transition, a material's temperature does not increase even as energy is added — for example, while ice melts at 0 °C. Explain how this is consistent with the relation [math]\displaystyle{ 1/T = \partial S / \partial E }[/math].

Solution: During the transition, energy enters the system as latent heat, and the entropy increases because the liquid phase has many more accessible microstates than the solid phase (molecules are no longer locked into a lattice). Both S and E increase, but they increase together at a constant ratio set by the transition temperature, so

[math]\displaystyle{ \frac{\partial S}{\partial E} = \frac{1}{T_{\text{melt}}} = \text{constant} }[/math]

The temperature stays fixed at the melting point until the entire phase transition is complete. The latent heat for melting ice, for instance, is

[math]\displaystyle{ L_f = 334 \text{ J/g}, \qquad \Delta S = \frac{L_f}{T} = \frac{334}{273.15} \approx 1.22 \text{ J/(g·K)} }[/math]

Difficult

Question: Two Einstein solids A and B are in thermal contact. Solid A has [math]\displaystyle{ N_A = 300 }[/math] oscillators and solid B has [math]\displaystyle{ N_B = 200 }[/math] oscillators. They share [math]\displaystyle{ q_{\text{total}} = 100 }[/math] quanta. Find the most probable value of [math]\displaystyle{ q_A }[/math], and show that at this point the two solids have the same temperature.

Solution: The total multiplicity is

[math]\displaystyle{ \Omega_{\text{total}}(q_A) = \Omega_A(N_A, q_A)\,\Omega_B(N_B, q_{\text{total}} - q_A) }[/math]

The most probable macrostate is the one that maximizes [math]\displaystyle{ \ln \Omega_{\text{total}} }[/math]. Taking the derivative with respect to [math]\displaystyle{ q_A }[/math] (noting that [math]\displaystyle{ dq_B = -dq_A }[/math], since the total is fixed), setting it to zero, and using Stirling's approximation gives

[math]\displaystyle{ \frac{\partial \ln \Omega_A}{\partial q_A} = \frac{\partial \ln \Omega_B}{\partial q_B} }[/math]

For large N and q the Einstein solid satisfies [math]\displaystyle{ \partial \ln \Omega / \partial q \approx \ln\!\big((q + N)/q\big) }[/math], so the equilibrium condition is

[math]\displaystyle{ \frac{q_A + N_A}{q_A} = \frac{q_B + N_B}{q_B} }[/math]

Solving with [math]\displaystyle{ q_A + q_B = 100 }[/math], [math]\displaystyle{ N_A = 300 }[/math], [math]\displaystyle{ N_B = 200 }[/math] gives

[math]\displaystyle{ q_A = q_{\text{total}} \cdot \frac{N_A}{N_A + N_B} = 100 \cdot \frac{300}{500} = 60, \qquad q_B = 40 }[/math]

The energy per oscillator is the same in both solids ([math]\displaystyle{ q_A/N_A = q_B/N_B = 0.2 }[/math]), and since temperature in the Einstein model is a monotonic function of energy per oscillator, the two temperatures are equal — exactly as expected for thermal equilibrium.
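The result can also be confirmed without Stirling's approximation by brute force: evaluate the exact total multiplicity for every possible split of the quanta and take the maximum. A short Python sketch (Python's arbitrary-precision integers handle the large factorials exactly):

```python
from math import comb

def omega(N, q):
    """Exact multiplicity of an Einstein solid."""
    return comb(q + N - 1, q)

N_A, N_B, q_total = 300, 200, 100

# Find the split of quanta that maximizes the total multiplicity
best_qA = max(range(q_total + 1),
              key=lambda qA: omega(N_A, qA) * omega(N_B, q_total - qA))
print(best_qA)  # 60, matching q_total * N_A / (N_A + N_B)
```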

Suggested Computational Model

Below is a description of a computational model that should be implemented in GlowScript / VPython and embedded into this page using a Trinket. The intended behavior is to simulate two Einstein solids exchanging energy quanta and to plot the resulting energy distribution alongside the theoretical multiplicity curve.

Goal: Visually demonstrate that two Einstein solids in thermal contact spontaneously evolve toward the macrostate of maximum total multiplicity (i.e., thermal equilibrium), and that the equilibrium distribution matches the theoretical prediction [math]\displaystyle{ q_A/N_A = q_B/N_B }[/math].

Inputs (constants the user can change at the top of the script):

  • N_A — number of oscillators in solid A (e.g., 300)
  • N_B — number of oscillators in solid B (e.g., 200)
  • q_total — total energy quanta to distribute (e.g., 100)
  • n_steps — number of Monte Carlo exchange steps (e.g., 50000)
  • initial_split — fraction of quanta initially placed in solid A (e.g., 1.0, meaning all quanta start in A)

Algorithm:

  1. Create two arrays of length N_A and N_B, each entry representing the number of quanta on that oscillator. Distribute q_total * initial_split quanta randomly among the oscillators of solid A and the rest among solid B.
  2. At each Monte Carlo step:
    1. Pick a random oscillator in either solid that currently has at least one quantum.
    2. Pick a second random oscillator anywhere in either solid.
    3. Move one quantum from the first to the second. (This conserves q_total and respects the assumption that all microstates are equally likely.)
  3. Every k steps, record q_A (the total quanta currently in solid A) and update a live histogram of visited q_A values.
  4. In parallel, plot the theoretical curve [math]\displaystyle{ \Omega_A(N_A, q_A) \Omega_B(N_B, q_{\text{total}} - q_A) }[/math] normalized to the same area as the histogram.

Visualization:

  • Two side-by-side rectangular blocks rendered with VPython box objects, each colored on a heat-map scale according to its current energy per oscillator (blue = cold, red = hot).
  • A live time series at the top showing q_A(t) approaching the equilibrium value [math]\displaystyle{ q_A^* = q_{\text{total}} N_A / (N_A + N_B) }[/math].
  • A live histogram at the bottom of visited q_A values, with the theoretical multiplicity curve overlaid as a smooth line.

Expected result: Even when all quanta start in solid A, within a few thousand steps the system relaxes to a sharply peaked distribution centered on [math]\displaystyle{ q_A^* = q_{\text{total}} N_A / (N_A + N_B) }[/math], and the histogram matches the theoretical multiplicity curve. This is a direct visualization of the Second Law: the system finds the macrostate of maximum multiplicity.
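A minimal text-only version of the algorithm above can be sketched in plain Python (the VPython blocks, histogram, and theoretical overlay are omitted, and the function name is illustrative):

```python
import random

def simulate(N_A=300, N_B=200, q_total=100, n_steps=50_000, seed=1):
    """Two Einstein solids exchanging single quanta at random.
    Returns q_A (total quanta in solid A) sampled every 100 steps."""
    rng = random.Random(seed)
    # Step 1: all quanta start in solid A (initial_split = 1.0)
    osc = [0] * (N_A + N_B)  # oscillators 0..N_A-1 belong to solid A
    for _ in range(q_total):
        osc[rng.randrange(N_A)] += 1
    history = []
    for step in range(n_steps):
        # Step 2a: pick a donor oscillator that has at least one quantum
        donor = rng.randrange(N_A + N_B)
        while osc[donor] == 0:
            donor = rng.randrange(N_A + N_B)
        # Steps 2b-2c: pick any recipient and move one quantum (conserves q_total)
        recipient = rng.randrange(N_A + N_B)
        osc[donor] -= 1
        osc[recipient] += 1
        # Step 3: record q_A every 100 steps
        if step % 100 == 0:
            history.append(sum(osc[:N_A]))
    return history

hist = simulate()
late = hist[len(hist) // 2:]  # discard the initial relaxation transient
print(sum(late) / len(late))  # fluctuates around q_A* = 60
```

Even in this stripped-down form, the recorded values of q_A start at 100 and settle into fluctuations around the predicted equilibrium value of 60.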

Embedding: Once the model is written on glowscript.org and converted to a Trinket, embed it on this page using the Trinket iframe snippet, e.g.

<iframe src="https://trinket.io/embed/glowscript/XXXXXXXX" width="100%" height="600" frameborder="0" marginwidth="0" marginheight="0" allowfullscreen></iframe>

Connectedness

How is this topic connected to something I am interested in? Entropy connects directly to information theory and machine learning. Claude Shannon's information entropy [math]\displaystyle{ H = -\sum p_i \log p_i }[/math] is mathematically identical (up to a constant) to Boltzmann's thermodynamic entropy. Modern machine learning models are routinely trained by minimizing cross-entropy, and the same statistical reasoning that explains why heat flows from hot to cold also explains why a well-trained model converges to the most probable explanation of the data.
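Shannon's formula is simple enough to illustrate in a few lines (the function name is illustrative):

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum p_i log2(p_i), in bits; zero-probability terms contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally uncertain
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable
```

Just as in thermodynamics, the entropy is largest when all outcomes are equally likely, i.e., when the "macrostate" is least informative about the "microstate."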

How is it connected to my major? For engineering majors, the second law sets a hard upper limit on the efficiency of every heat engine and every refrigerator: the Carnot efficiency [math]\displaystyle{ \eta = 1 - T_C/T_H }[/math]. No amount of cleverness in design can beat this bound. For computer scientists, Landauer's principle states that erasing a single bit of information in an environment at temperature T must dissipate at least [math]\displaystyle{ k_B T \ln 2 }[/math] of heat — a thermodynamic cost on computation itself.

Industrial applications: Entropy and temperature are central to power generation (steam turbines, gas turbines, nuclear reactors), refrigeration and air conditioning, materials processing, chemical engineering (free energy and reaction spontaneity), and the design of low-power electronics where thermal management is the dominant constraint.

History

The concepts of temperature, entropy, and energy have been intertwined throughout the history of physics. Early notions of heat described it as a fluid (the "caloric"), and even Sir Isaac Newton speculated that heat might have mass. In 1824, the French engineer Sadi Carnot published Reflections on the Motive Power of Fire, which established the upper limit on the efficiency of any heat engine — the first quantitative statement of the second law, written before entropy itself had been named.

In the 1850s, Rudolf Clausius formalized the idea that thermodynamic processes irreversibly degrade energy. In 1865 he coined the word entropy from the Greek τροπή ("transformation"), explaining:

I prefer going to the ancient languages for the names of important scientific quantities, so that they may mean the same thing in all living tongues. I propose, therefore, to call S the entropy of a body, after the Greek word "transformation". I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance that an analogy of denominations seems to me helpful.

In the 1870s and 1880s, Ludwig Boltzmann established the microscopic foundation of entropy by connecting it to the number of accessible microstates, [math]\displaystyle{ S = k_B \ln \Omega }[/math] — the equation engraved on his tombstone in Vienna. Josiah Willard Gibbs then extended the framework into the modern statistical mechanics used today, treating entropy as a property of probability distributions over microstates.

The 20th century added two more layers. In 1948, Claude Shannon showed that the same mathematical form governs information. In the 1970s, Jacob Bekenstein and Stephen Hawking showed that black holes themselves carry entropy proportional to their horizon area, tying thermodynamics to gravitation and quantum mechanics.

Notable Scientists

  • Sadi Carnot
  • Rudolf Clausius
  • James Prescott Joule
  • Ludwig Boltzmann
  • Josiah Willard Gibbs
  • Claude Shannon

See also

All thermodynamics is linked to the concept of Energy. See also Heat, Temperature, Statistical Mechanics, Spontaneity. For an interesting cosmological perspective, see the Heat Death of the Universe.

Further reading

External links

References

  1. "Entropy." Wikipedia, Wikimedia Foundation, https://en.wikipedia.org/wiki/Entropy
  2. "Temperature." Wikipedia, Wikimedia Foundation, https://en.wikipedia.org/wiki/Temperature
  3. "Einstein solid." Wikipedia, Wikimedia Foundation, https://en.wikipedia.org/wiki/Einstein_solid
  4. Schroeder, Daniel V. An Introduction to Thermal Physics. Addison-Wesley, 2000.
  5. Carnot, Sadi. Reflections on the Motive Power of Fire. 1824. Public-domain English translation: https://en.wikisource.org/wiki/Reflections_on_the_Motive_Power_of_Heat
  6. Bekenstein, J. D. "Black Holes and Entropy." Physical Review D 7, 2333 (1973).