Temperature & Entropy
Revision as of 17:29, 21 April 2022
Claimed by Josh Brandt (April 19th, Spring 2022)
“The increase of ... entropy is what distinguishes the past from the future, giving a direction to time.” - Stephen Hawking
The Main Idea
A Mathematical Model
The fundamental relationship between Temperature [math]\displaystyle{ T }[/math], Energy [math]\displaystyle{ E }[/math], and Entropy [math]\displaystyle{ S \equiv k_B \ln\Omega }[/math] is [math]\displaystyle{ \frac{dS}{dE}=\frac{1}{T} }[/math], where [math]\displaystyle{ \Omega }[/math] is the number of microstates available to the system.
In order to understand and predict the behavior of solids, we can employ the Einstein Model. This simple model treats the interatomic (Coulombic) force as a spring, justified by the approximately parabolic shape of the potential energy curve near equilibrium in models of interatomic attraction (see Morse Potential, Buckingham Potential, and Lennard-Jones Potential). In this way, one can imagine a cubic lattice of spring-interconnected atoms, with an imaginary cube centered on each atom slicing each and every spring in half.
A quantum mechanical harmonic oscillator has quantized energy states, with one quantum being a unit of energy [math]\displaystyle{ q=\hbar \omega_0 }[/math]. These quanta can be distributed throughout a solid by going into any one of the three independent oscillators (one per spatial axis) associated with each atom. In fact, the number of ways to distribute [math]\displaystyle{ q }[/math] quanta of energy among [math]\displaystyle{ N }[/math] oscillators is [math]\displaystyle{ \Omega = \frac{(q+N-1)!}{q!(N-1)!} }[/math]. From this, it is clear that entropy is intrinsically related to the number of ways to distribute energy in a system.
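The multiplicity formula above is easy to verify numerically. The sketch below (Python; the tiny solid of 3 atoms, i.e. N = 9 oscillators, is an illustrative assumption, not a realistic size) counts the microstates for a given q and converts the count into an entropy:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(q, N):
    """Number of ways to distribute q energy quanta among N oscillators.

    (q + N - 1)! / (q! (N - 1)!) is the binomial coefficient C(q + N - 1, q).
    """
    return math.comb(q + N - 1, q)

def entropy(q, N):
    """Entropy S = k_B ln(Omega), in J/K."""
    return K_B * math.log(multiplicity(q, N))

# A tiny Einstein solid: 3 atoms -> N = 9 oscillators
print(multiplicity(4, 9))   # 495 ways to place 4 quanta
print(entropy(4, 9))        # the corresponding entropy in J/K
```

Even for this tiny system the multiplicity grows rapidly with q, which is why the logarithm in the definition of entropy is convenient.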
From here, it is almost intuitive that systems should evolve toward higher entropy over time. Following the fundamental postulate of statistical mechanics (in equilibrium, every accessible microstate of an isolated system is equally probable), we can see that the macrostate containing the most microstates is the most likely endpoint for a sufficiently time-evolved system. Having many microstates implies having many different ways to distribute energy, and thus high entropy. It should then make sense that the Second Law of Thermodynamics states that "the entropy of an isolated system tends to increase through time".
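This tendency can be made concrete with two Einstein solids in thermal contact. In the hypothetical setup below (the N and q values are chosen only for illustration), every way of splitting the total energy is counted, and the most probable macrostate turns out to share energy in proportion to the number of oscillators:

```python
import math

def multiplicity(q, N):
    """Ways to distribute q quanta among N oscillators: C(q + N - 1, q)."""
    return math.comb(q + N - 1, q)

# Two solids sharing a fixed total energy (illustrative sizes)
N1, N2, q_total = 300, 200, 100

# Combined multiplicity for every possible split of the energy
omegas = [multiplicity(q1, N1) * multiplicity(q_total - q1, N2)
          for q1 in range(q_total + 1)]

# The macrostate with the most microstates is the most probable one
q1_most_likely = max(range(q_total + 1), key=lambda q1: omegas[q1])
print(q1_most_likely)  # 60, i.e. q_total * N1 / (N1 + N2)
```

The peak at [math]\displaystyle{ q_1/N_1 = q_2/N_2 }[/math] is exactly the equal-temperature condition: a system left to explore its microstates at random overwhelmingly ends up near this macrostate.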
The treatment of large numbers of particles is the study of Statistical Mechanics, whose applications to Physics are discussed in another page. Thermodynamics and Statistical Mechanics became linked as the atomic behavior underlying thermodynamic properties was discovered.
A Computational Model
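One computational use of the relationship [math]\displaystyle{ \frac{dS}{dE}=\frac{1}{T} }[/math] is to estimate the temperature of an Einstein solid directly from its entropy curve. The sketch below (Python; the value used for [math]\displaystyle{ \hbar\omega_0 }[/math] is an assumed, illustrative number, not a measured one) approximates the derivative with a finite difference of one quantum:

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
HBAR_OMEGA = 4e-21      # assumed energy of one quantum, J (illustrative value)

def entropy(q, N):
    """S = k_B ln(Omega) for q quanta among N oscillators."""
    return K_B * math.log(math.comb(q + N - 1, q))

def temperature(q, N):
    """T = dE/dS, with dS estimated by a centered difference of one quantum."""
    dS = (entropy(q + 1, N) - entropy(q - 1, N)) / 2
    return HBAR_OMEGA / dS

# Temperature of a 100-oscillator solid at increasing energy content
for q in (10, 50, 100):
    print(q, temperature(q, N=100))
```

As energy is added, the entropy curve flattens, so the computed temperature rises, which is exactly what [math]\displaystyle{ \frac{dS}{dE}=\frac{1}{T} }[/math] predicts.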
Examples
Simple
Middling
Difficult
Connectedness
- How is this topic connected to something that you are interested in?
I am very interested in astrophysics, and there are very interesting cosmological analyses of entropy. For example, why do planets and stars form if they're more ordered than the interstellar medium they come from? Is the entropy of the universe really increasing? These topics have implications far beyond classical scales.
- How is it connected to your major?
I'm a Physics major.
- Is there an interesting industrial application?
I'm a Physics major. But, I'll still answer: of course! You might say entropy is the opposing force of efficiency. Historically, it was discovered by Carnot in the context of the heat inevitably lost when doing work. If we could create perfectly efficient machines, the world would have considerably fewer problems.
History
The concepts of Temperature, Entropy, and Energy have been linked throughout history. Early notions described heat as a substance or particle; Sir Isaac Newton even believed it to have mass. In 1803, Lazare Carnot (father of Nicolas Leonard Sadi Carnot) formalized the idea that energy cannot be perfectly channeled: disorder is an intrinsic property of energy transformation. In the mid-1800s, Rudolf Clausius mathematically described a "transformation-content" of energy lost during any thermodynamic process. The word itself comes from the Greek τροπή (tropē), meaning "change" or "transformation", with the prefix "en-" taken from "energy"; Clausius introduced the term "entropy" as we use it today in 1865. Clausius himself said: "I prefer going to the ancient languages for the names of important scientific quantities, so that they may mean the same thing in all living tongues. I propose, therefore, to call S the entropy of a body, after the Greek word "transformation". I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to me helpful." Two decades later, Ludwig Boltzmann established the connection between entropy and the number of states of a system, introducing the equation we use today and the famous Boltzmann Constant: the first major idea of the modern field of statistical thermodynamics. (Wikipedia)
Notable Scientists
Josiah Willard Gibbs
James Prescott Joule
Nicolas Leonard Sadi Carnot
Rudolf Clausius
See also
All thermodynamics is linked to the concept of Energy. See also Heat, Temperature, Statistical Mechanics, and Spontaneity. For fun, see "Heat Death of the Universe": https://en.wikipedia.org/wiki/Heat_death_of_the_universe
Further reading
External links
https://en.wikipedia.org/wiki/Energy
https://en.wikipedia.org/wiki/Thermodynamic_equilibrium
https://en.wikipedia.org/wiki/Conservation_of_energy
https://fs.blog/entropy/
References