Entropy

From Physics Book

The Main Idea

Entropy is a measure of disorder, randomness, or the number of ways a system's energy can be distributed. Scientifically, entropy quantifies the number of possible microstates (specific configurations) that correspond to a macrostate (observable state). Higher entropy corresponds to greater energy dispersal and higher disorder.

[Figure: Representation of energy quanta distributed among three wells, depicting the three spatial directions in which an atom can store energy. Here Ω = 6.]

A system typically refers to particles (e.g., atoms) that store energy in quantized units called quanta (q). Each atom has discrete energy levels in three spatial directions: x, y, and z. Imagine an atom as having three "wells," each capable of storing energy balls. For instance:

One quantum of energy can be stored in three ways (one well holds the ball, while the other two are empty), and 2 quanta can be distributed among the wells in Ω = 6 ways. This idea extends to multiple atoms. For example, 1 quantum shared between 2 atoms has Ω = 6 possible distributions, and 2 quanta shared between 2 atoms has Ω = 21. The number of possible configurations for distributing the energy is denoted Ω.
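
These counts can be verified with a short Python sketch using the standard library's math.comb; the stars-and-bars formula it encodes, Ω = (q + N - 1)!/(q!(N - 1)!), is given in the Mathematical Model section below:

<syntaxhighlight lang="python">
from math import comb

def num_microstates(q, wells):
    # Stars-and-bars count: the number of ways to distribute q identical
    # quanta among `wells` distinguishable wells is C(q + wells - 1, q).
    return comb(q + wells - 1, q)

print(num_microstates(1, 3))  # 1 quantum, 1 atom (3 wells)   -> 3
print(num_microstates(2, 3))  # 2 quanta,  1 atom (3 wells)   -> 6
print(num_microstates(1, 6))  # 1 quantum, 2 atoms (6 wells)  -> 6
print(num_microstates(2, 6))  # 2 quanta,  2 atoms (6 wells)  -> 21
</syntaxhighlight>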

Microstates and Macrostates

A macrostate represents the total energy (q) and number of particles in a system. A microstate represents one specific way that energy is distributed. For a system of two subsystems, the total number of microstates is the product of the individual subsystems' microstate counts:

[math]\displaystyle{ Ω_{Total} = Ω_1 \cdot Ω_2 }[/math]

Fundamental Assumption of Statistical Mechanics

The Fundamental Assumption of Statistical Mechanics posits that, over time, all microstates of an isolated system are equally likely. Consequently:

The most probable macrostate is the one with the highest number of microstates. Consider the case of 4 quanta distributed between 2 atoms:

[Figure: Graph showing the possible energy distributions and their probabilities.]

There are Ω = 36 microstates in which the energy is evenly split, out of Ω_Total = 126 total microstates. The even split is the most probable macrostate (about 29%), aligning with experimental data. As systems grow larger (e.g., q > 100 and n > 100), microstate counts become astronomical (Ω > 10^100). For macroscopic systems (n > 10^20), the most probable distribution overwhelmingly dominates.
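
This tally can be reproduced in a few lines of Python; the split into two 3-well atoms follows the setup above:

<syntaxhighlight lang="python">
from math import comb

def omega(q, wells):
    return comb(q + wells - 1, q)

q_total, wells_per_atom = 4, 3
total = omega(q_total, 2 * wells_per_atom)   # 126 microstates in all
for q1 in range(q_total + 1):
    ways = omega(q1, wells_per_atom) * omega(q_total - q1, wells_per_atom)
    print(f"atom 1: {q1} quanta, atom 2: {q_total - q1} -> {ways} ({ways/total:.1%})")
# The 2/2 split accounts for 36 of the 126 microstates, about 28.6%.
</syntaxhighlight>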

Mathematical Definition of Entropy

Entropy (S) provides a practical way to quantify disorder. It is defined using Ω:

[math]\displaystyle{ S = k_B \ln(Ω) }[/math]

where [math]\displaystyle{ k_B }[/math] is the Boltzmann constant ([math]\displaystyle{ 1.38 \times 10^{-23} }[/math] J/K) and ln is the natural logarithm. Because microstate counts multiply, entropies add. For two systems:

[math]\displaystyle{ S_{Total} = k_B \ln(Ω_1 \cdot Ω_2) = k_B \ln(Ω_1) + k_B \ln(Ω_2) }[/math]
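
A quick numerical check of this additivity (the two Ω values used here are the nanoparticle counts worked out in the Examples section below):

<syntaxhighlight lang="python">
import math

k_B = 1.38e-23  # Boltzmann constant, J/K

def entropy(omega):
    return k_B * math.log(omega)

omega_1, omega_2 = 4368, 1140
# Entropy of the combined system equals the sum of the parts:
print(entropy(omega_1 * omega_2))           # ~2.13e-22 J/K
print(entropy(omega_1) + entropy(omega_2))  # same value
</syntaxhighlight>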

Second Law of Thermodynamics

The Second Law of Thermodynamics states that the entropy of a closed system tends to increase over time until the system reaches equilibrium. Consider two systems:

System 1: few atoms, high energy. System 2: many atoms, low energy. Energy flows from System 1 to System 2 because ΔΩ_2 ≫ ΔΩ_1, resulting in ΔS_Total > 0. Entropy increases, and the combined system approaches equilibrium.

Entropy and Temperature

Entropy relates closely to temperature (T). Temperature measures energy distribution across a system's microstates and is defined as:

[math]\displaystyle{ \frac{1}{T} = \frac{\partial S}{\partial U} }[/math]

where U is the internal energy.

This relation explains why heat capacity ([math]\displaystyle{ C = \frac{ΔU}{ΔT} }[/math]) varies across materials and phases. For single atoms, specific heat provides insight into their microscopic energy distribution.

Applications of Entropy

  • Thermodynamics: predicting heat flow and energy transformations.
  • Statistical Mechanics: explaining macroscopic phenomena like gas diffusion.
  • Information Theory: measuring uncertainty in data streams.
  • Cosmology: understanding entropy's role in black holes and the universe's evolution.
  • Quantum Mechanics: entanglement and quantum information processing.

Entropy bridges the microscopic world of particles to the macroscopic phenomena we observe, revealing profound insights into the nature of energy, order, and time.

[Figure (Second law energy flow.png): Illustration of energy flow between two systems, increasing total entropy.]

A Mathematical Model

The energy of one quantum (q = 1) is different for every type of atom and is found using:

  • [math]\displaystyle{ E_{quantum} = ħ\sqrt{\frac{4k_{s,i}}{m_a}} }[/math]
    • Where ħ is the reduced Planck constant, [math]\displaystyle{ k_{s,i} }[/math] is the interatomic spring stiffness, and [math]\displaystyle{ m_a }[/math] is the mass of the atom.
    • This value is measured in Joules.

The number of microstates for a given macrostate is found using:

  • [math]\displaystyle{ Ω = \frac{(q + N - 1)!}{q!\,(N - 1)!} }[/math]
    • Where N is the number of energy wells in the system (3 × the number of atoms in the system) and ! denotes the factorial function.

Entropy is found using:

  • [math]\displaystyle{ S = k_B \ln(Ω) }[/math]
    • Where [math]\displaystyle{ k_B = 1.38 \times 10^{-23} }[/math] J/K is the Boltzmann constant.

Temperature (in Kelvin) is defined as:

  • [math]\displaystyle{ \frac{1}{T} = \frac{\partial S}{\partial E_{Internal}} }[/math]
    • Where [math]\displaystyle{ E_{Internal} }[/math] is the total energy stored in the system, i.e., the number of quanta times the energy of one quantum: [math]\displaystyle{ E_{Internal} = q \cdot ħ\sqrt{\frac{4k_{s,i}}{m_a}} }[/math]
      • This is preferred to [math]\displaystyle{ \frac{\partial S}{\partial q} }[/math] because two objects of different materials can have the same q value while storing different quantities of energy. For the direct comparisons this relationship is used for, universality must be maintained.

Specific heat is found using:

  • [math]\displaystyle{ C = \frac{ΔE_{Atom}}{ΔT} }[/math]
    • For macroscopic bodies use: [math]\displaystyle{ C = \frac{ΔE_{System}}{ΔT \cdot N} }[/math]
      • Where N is the number of atoms in the system.
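
Collected into code, the model above might look like the following sketch. Function names are mine, not from the text; it assumes 3 wells per atom and approximates the derivatives with finite differences, as the Examples section below does:

<syntaxhighlight lang="python">
from math import comb, log, sqrt

HBAR = 1.0546e-34   # reduced Planck constant, J*s
K_B = 1.38e-23      # Boltzmann constant, J/K

def quantum_energy(k_s, m_a):
    """Energy (J) of one quantum for interatomic stiffness k_s and atomic mass m_a."""
    return HBAR * sqrt(4 * k_s / m_a)

def omega(q, n_atoms):
    """Number of microstates for q quanta among N = 3*n_atoms wells."""
    N = 3 * n_atoms
    return comb(q + N - 1, q)

def entropy(q, n_atoms):
    """S = k_B ln(Omega), in J/K."""
    return K_B * log(omega(q, n_atoms))

def temperature(q, n_atoms, e_quantum):
    """T ~ dE/dS via a centered finite difference around q (needs q >= 1)."""
    dS = entropy(q + 1, n_atoms) - entropy(q - 1, n_atoms)
    return 2 * e_quantum / dS

def specific_heat(q, n_atoms, e_quantum):
    """Per-atom specific heat, C ~ (dE/dT)/n_atoms, from temperatures at q -/+ 1/2."""
    t_lo = e_quantum / (entropy(q, n_atoms) - entropy(q - 1, n_atoms))
    t_hi = e_quantum / (entropy(q + 1, n_atoms) - entropy(q, n_atoms))
    return e_quantum / (t_hi - t_lo) / n_atoms

# Example: the lead nanoparticle from the "Middling" example below.
e_q = quantum_energy(5, 0.207 / 6.022e23)   # ~8.044e-22 J per quantum
print(temperature(6, 5, e_q))               # ~50.6 K
</syntaxhighlight>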

A Computational Model

How to calculate entropy with given n and q values:

https://trinket.io/embed/glowscript/c24ad7936e
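
The linked trinket is not reproduced here; a minimal standalone sketch of the same calculation (assuming, as above, 3 wells per atom) could look like:

<syntaxhighlight lang="python">
from math import comb, log

k_B = 1.38e-23  # Boltzmann constant, J/K

def entropy(n_atoms, q):
    """Entropy (J/K) of n_atoms sharing q energy quanta, 3 wells per atom."""
    N = 3 * n_atoms
    return k_B * log(comb(q + N - 1, q))

print(entropy(3, 4))  # Omega = 495, S ~ 8.56e-23 J/K
</syntaxhighlight>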

Examples

Simple

1) How much energy in Joules is stored in a cluster of 8 copper atoms containing 7 energy quanta given that the interatomic spring stiffness for copper is 7 N/m?

The energy within is equal to 7 times the joule value of one copper energy quantum:

  • [math]\displaystyle{ E = 7 * q_{Copper} }[/math]

Plug in given values:

  • [math]\displaystyle{ E = 7 * ħ \sqrt\frac{4*7}{\frac{.063}{6.022 * 10^{23}}} }[/math]
    • Dividing the molar mass of copper (0.063 kg/mol) by Avogadro's number yields the mass of one copper atom.

Solve:

  • [math]\displaystyle{ E = 1.208*10^{-20} }[/math] Joules
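
As a check, the same arithmetic in Python (ħ ≈ 1.0546 × 10^-34 J·s and Avogadro's number are the only constants assumed):

<syntaxhighlight lang="python">
from math import sqrt

HBAR = 1.0546e-34        # reduced Planck constant, J*s
m_a = 0.063 / 6.022e23   # mass of one copper atom, kg
E = 7 * HBAR * sqrt(4 * 7 / m_a)   # 7 quanta at k_s = 7 N/m
print(E)                 # ~1.208e-20 J
</syntaxhighlight>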

__________________________________________________________________________________________________________________________________________________

2) Calculate [math]\displaystyle{ Ω_{Total} }[/math] for a system of 2 nanoparticles, one containing 5 energy quanta and 4 atoms and the second containing 3 energy quanta and 6 atoms.

Calculate [math]\displaystyle{ Ω_1 }[/math]:

  • [math]\displaystyle{ Ω_1 = \frac{16!}{5!11!} = 4,368 }[/math]
    • For [math]\displaystyle{ Ω_1, }[/math] N = 12

Calculate [math]\displaystyle{ Ω_2 }[/math]:

  • [math]\displaystyle{ Ω_2 = \frac{20!}{3!17!} = 1,140 }[/math]
    • For [math]\displaystyle{ Ω_2, }[/math] N = 18

Calculate [math]\displaystyle{ Ω_{Total} }[/math]:

  • [math]\displaystyle{ Ω_{Total} = Ω_1 * Ω_2 = 4,979,520 }[/math]
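
Both counts and their product can be checked in a few lines of Python:

<syntaxhighlight lang="python">
from math import comb

def omega(q, n_atoms):
    N = 3 * n_atoms               # energy wells
    return comb(q + N - 1, q)

omega_1 = omega(5, 4)             # 4368
omega_2 = omega(3, 6)             # 1140
print(omega_1 * omega_2)          # 4979520
</syntaxhighlight>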

Middling

For a nanoparticle of 5 lead atoms ([math]\displaystyle{ k_{s,i} = 5 }[/math] N/m), what is the approximate temperature when 6 quanta of energy are stored within the nanoparticle?


Calculate the joule value of one quantum of energy for lead:

  • [math]\displaystyle{ E = ħ\sqrt{\frac{4*5}{\frac{.207}{6.022*10^{23}}}} = 8.044 * 10^{-22} }[/math] Joules


Calculate the entropy of the nanoparticle when it contains 5 and 7 quanta of energy:

  • [math]\displaystyle{ Ω_5 = \frac{19!}{5!14!} = 11628 }[/math]
  • [math]\displaystyle{ S = k_B * ln(11628) = k_B * 9.361 }[/math]


  • [math]\displaystyle{ Ω_7 = \frac{21!}{7!14!} = 116280 }[/math]
  • [math]\displaystyle{ S = k_B * ln(116280) = k_B * 11.664 }[/math]


Derive an approximate formula for T:

  • [math]\displaystyle{ \frac{1}{T} = \frac{\partial S}{\partial E} }[/math]
  • [math]\displaystyle{ T = \frac{\partial E}{\partial S} }[/math]
  • [math]\displaystyle{ T ≈ \frac{∆E}{∆S} }[/math]


Solve for T:

  • [math]\displaystyle{ T ≈ \frac{2E}{k_B * (ln(Ω_7) - ln(Ω_5))} }[/math]
  • [math]\displaystyle{ T ≈ 50.621 K }[/math]
    • Because the relation is a derivative, to find T we must find the slope of a hypothetical E vs S graph at the value of 6 energy quanta. The easiest way to do this is to find the average slope of a region containing the value of 6 energy quanta at the center. As we use the region between q = 5 and q = 7, the change in E is equal to 2 energy quanta for lead.
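
The same estimate can be checked in Python (constants assumed: ħ ≈ 1.0546 × 10^-34 J·s, k_B = 1.38 × 10^-23 J/K):

<syntaxhighlight lang="python">
from math import comb, log, sqrt

HBAR, K_B = 1.0546e-34, 1.38e-23
m_a = 0.207 / 6.022e23            # mass of one lead atom, kg
E = HBAR * sqrt(4 * 5 / m_a)      # one lead quantum, ~8.044e-22 J

def S(q, wells=15):               # 5 atoms * 3 wells
    return K_B * log(comb(q + wells - 1, q))

T = 2 * E / (S(7) - S(5))         # centered difference around q = 6
print(T)                          # ~50.6 K
</syntaxhighlight>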

Difficult

What is the specific heat of a cluster of 3 copper atoms containing 4 quanta of energy?


Calculate the joule value of one quantum of energy for copper:

  • [math]\displaystyle{ E = ħ\sqrt{\frac{4*7}{\frac{.063}{6.022*10^{23}}}} = 1.725 * 10^{-21} }[/math] Joules


Calculate the entropy of the nanoparticle when it contains 3, 4, and 5 quanta of energy:

  • [math]\displaystyle{ Ω_3 = \frac{11!}{3!8!} = 165 }[/math]
  • [math]\displaystyle{ S = k_B * ln(165) = k_B * 5.106 }[/math]


  • [math]\displaystyle{ Ω_4 = \frac{12!}{4!8!} = 495 }[/math]
  • [math]\displaystyle{ S = k_B * ln(495) = k_B * 6.205 }[/math]


  • [math]\displaystyle{ Ω_5 = \frac{13!}{5!8!} = 1287 }[/math]
  • [math]\displaystyle{ S = k_B * ln(1287) = k_B * 7.160 }[/math]


Derive an approximate formula for T:

  • [math]\displaystyle{ \frac{1}{T} = \frac{\partial S}{\partial E} }[/math]
  • [math]\displaystyle{ T = \frac{\partial E}{\partial S} }[/math]
  • [math]\displaystyle{ T ≈ \frac{∆E}{∆S} }[/math]


Solve for T at 3.5 and 4.5 quanta of energy:

  • [math]\displaystyle{ T_{3.5} ≈ \frac{E}{k_B * (ln(Ω_4) - ln(Ω_3))} }[/math]
  • [math]\displaystyle{ T_{3.5} ≈ 113.74 K }[/math]


  • [math]\displaystyle{ T_{4.5} ≈ \frac{E}{k_B * (ln(Ω_5) - ln(Ω_4))} }[/math]
  • [math]\displaystyle{ T_{4.5} ≈ 130.89 K }[/math]


Solve for specific heat:

  • [math]\displaystyle{ C = \frac{1}{N}\frac{∆E}{∆T} = \frac{1}{3}\frac{E}{T_{4.5} - T_{3.5}} }[/math]
  • [math]\displaystyle{ C = 3.353 * 10^{-23} }[/math] J/K
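
A numerical check of the whole chain (same assumed constants as before; small differences from the worked values are rounding):

<syntaxhighlight lang="python">
from math import comb, log, sqrt

HBAR, K_B = 1.0546e-34, 1.38e-23
m_a = 0.063 / 6.022e23             # mass of one copper atom, kg
E = HBAR * sqrt(4 * 7 / m_a)       # one copper quantum, ~1.725e-21 J

def S(q, wells=9):                 # 3 atoms * 3 wells
    return K_B * log(comb(q + wells - 1, q))

T_35 = E / (S(4) - S(3))           # ~113.8 K
T_45 = E / (S(5) - S(4))           # ~130.8 K
C = E / (T_45 - T_35) / 3          # per-atom specific heat
print(C)                           # ~3.4e-23 J/K, matching the worked answer to rounding
</syntaxhighlight>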

Connectedness

In my research I read that entropy is known as "time's arrow," which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscenely long expanse of time the universe will eventually suffer a "heat death" and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov's Multivac may finally solve the Last Question and reboot the entire universe again.

The study of entropy is pertinent to my major as an Industrial Engineer, as the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistical business major. Though the odds are unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related.

My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume anything in the field of quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfers.

History

Entropy was formally named from the Greek en + tropē, meaning "transformation content," by German physicist Rudolf Clausius in the 1850s. The study of entropy grew from the observation that energy is always lost to friction and dissipation in engines. "Entropy" was the formal name Clausius gave to this lost energy when he began formulating the first thermodynamic systems.

The concept of entropy was then expanded on by Ludwig Boltzmann who provided the rigorous mathematical definition used today by framing the question in terms of statistical mechanics. Surprisingly, it was not Boltzmann who incorporated the previously found Boltzmann constant ([math]\displaystyle{ k_B }[/math]) into the definition, but rather J. Willard Gibbs.

The study of entropy has been used in numerous applications since its inception. Erwin Schrödinger used the concept of entropy to explain the remarkably low replication error of DNA structures in living beings in his book "What is Life?". Entropy also has numerous parallels in information theory, regarding information lost in transmission and broadcasting.

See also

Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.

External links

Great TED-ED on the subject:

References