<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://www.physicsbook.gatech.edu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Ctyler9</id>
	<title>Physics Book - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="http://www.physicsbook.gatech.edu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Ctyler9"/>
	<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/Special:Contributions/Ctyler9"/>
	<updated>2026-04-30T02:40:28Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.42.7</generator>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30776</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30776"/>
		<updated>2017-11-30T02:59:22Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Computational Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can be lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems and 8 quanta of energy: each quantum can be assigned to either system. All 8 quanta could go to one system, or they could be split evenly. Because every distinct arrangement of the quanta is equally likely, a probability distribution can be formed over the possible divisions of energy. In this model, configurations near equilibrium (energy shared evenly) have the highest probability, while the scenarios where all the quanta sit in just one of the two systems have the lowest. In this way entropy gains a new definition: &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (TED-Ed).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the energy, the number of energy arrangements, and the entropy of a solid:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
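In symbols, the standard forms of these results are as follows (assuming the images above show the Einstein-solid formulas from the Chabay and Sherwood text, in which each oscillator is assigned an effective stiffness of 4k_s):

```latex
% Energy of an Einstein solid: q quanta, each of size \hbar\omega_0,
% where the effective oscillator stiffness is taken as 4 k_s.
E = q\,\hbar\omega_0, \qquad \omega_0 = \sqrt{\frac{4 k_s}{m_a}}
% Number of ways to arrange q quanta among n one-dimensional oscillators:
\Omega = \frac{(q + n - 1)!}{q!\,(n - 1)!}
% Entropy in terms of the number of microstates:
S = k_B \ln \Omega
```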
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How to calculate entropy with given n and q values:&lt;br /&gt;
&lt;br /&gt;
https://trinket.io/embed/glowscript/c24ad7936e&lt;br /&gt;
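If the trinket link is unavailable, the same calculation can be sketched in a few lines of Python (a minimal stand-in, not necessarily the code behind the trinket link):

```python
# Count the arrangements of q quanta among n one-dimensional oscillators
# and convert that count into an entropy.
from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_ways(q, n):
    # Omega = (q + n - 1)! / (q! (n - 1)!), the "stars and bars" count
    return comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B ln(Omega)
    return K_B * log(num_ways(q, n))

print(num_ways(4, 21))  # arrangements of 4 quanta among 21 oscillators
print(entropy(4, 21))   # entropy in J/K
```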
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are there in the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
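The three examples above can be checked numerically. This sketch assumes the Chabay and Sherwood convention that each oscillator&#039;s effective stiffness is 4 times the interatomic stiffness, a copper molar mass of 0.064 kg/mol, and that the 7 atoms contribute n = 3 × 7 = 21 oscillators; the printed numbers are estimates, not necessarily the values in the written solutions.

```python
# Rough check of the copper nanoparticle examples (assumptions noted above).
from math import sqrt, comb, log

K_S = 28.0              # interatomic spring stiffness, N/m (given)
M_A = 0.064 / 6.022e23  # mass of one copper atom, kg
HBAR = 1.0546e-34       # reduced Planck constant, J s
K_B = 1.38e-23          # Boltzmann constant, J/K

# Angular frequency of one oscillator, using effective stiffness 4*K_S.
omega0 = sqrt(4 * K_S / M_A)
quantum = HBAR * omega0          # energy of one quantum, J

n = 3 * 7                        # oscillators: 3 per atom, 7 atoms
q = 4                            # quanta of energy
big_omega = comb(q + n - 1, q)   # number of microstates
S = K_B * log(big_omega)         # entropy, J/K

print(quantum, big_omega, S)
```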
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics term. Entropy is a fundamental part of what makes the universe tick, and it is so powerful that it may (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may yet solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is at its heart statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though it is unlikely that entropy will come up directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers goes no further than a couple of Wikipedia articles and YouTube videos, but I assume quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on, mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and also expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30775</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30775"/>
		<updated>2017-11-30T02:59:00Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Computational Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can be lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems and 8 quanta of energy: each quantum can be assigned to either system. All 8 quanta could go to one system, or they could be split evenly. Because every distinct arrangement of the quanta is equally likely, a probability distribution can be formed over the possible divisions of energy. In this model, configurations near equilibrium (energy shared evenly) have the highest probability, while the scenarios where all the quanta sit in just one of the two systems have the lowest. In this way entropy gains a new definition: &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (TED-Ed).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the energy, the number of energy arrangements, and the entropy of a solid:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
https://trinket.io/embed/glowscript/c24ad7936e&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are there in the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics term. Entropy is a fundamental part of what makes the universe tick, and it is so powerful that it may (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may yet solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is at its heart statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though it is unlikely that entropy will come up directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers goes no further than a couple of Wikipedia articles and YouTube videos, but I assume quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on, mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and also expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30773</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30773"/>
		<updated>2017-11-30T02:58:46Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Computational Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can be lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems and 8 quanta of energy: each quantum can be assigned to either system. All 8 quanta could go to one system, or they could be split evenly. Because every distinct arrangement of the quanta is equally likely, a probability distribution can be formed over the possible divisions of energy. In this model, configurations near equilibrium (energy shared evenly) have the highest probability, while the scenarios where all the quanta sit in just one of the two systems have the lowest. In this way entropy gains a new definition: &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (TED-Ed).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the energy, the number of energy arrangements, and the entropy of a solid:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
[[embed]]&amp;lt;iframe src=&amp;quot;https://trinket.io/embed/glowscript/c24ad7936e&amp;quot; width=&amp;quot;100%&amp;quot; height=&amp;quot;356&amp;quot; frameborder=&amp;quot;0&amp;quot; marginwidth=&amp;quot;0&amp;quot; marginheight=&amp;quot;0&amp;quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;[[/embed]]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are there in the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics term. Entropy is a fundamental part of what makes the universe tick, and it is so powerful that it may (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may yet solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is at its heart statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though it is unlikely that entropy will come up directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers goes no further than a couple of Wikipedia articles and YouTube videos, but I assume quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on, mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and also expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30770</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30770"/>
		<updated>2017-11-30T02:57:55Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Computational Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can be lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems and 8 quanta of energy: each quantum can be assigned to either system. All 8 quanta could go to one system, or they could be split evenly. Because every distinct arrangement of the quanta is equally likely, a probability distribution can be formed over the possible divisions of energy. In this model, configurations near equilibrium (energy shared evenly) have the highest probability, while the scenarios where all the quanta sit in just one of the two systems have the lowest. In this way entropy gains a new definition: &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (TED-Ed).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the energy, the number of energy arrangements, and the entropy of a solid:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
[[embed]]&lt;br /&gt;
[[/embed]]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are there in the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics term. Entropy is a fundamental part of what makes the universe tick, and it is so powerful that it may (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may yet solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is at its heart statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though it is unlikely that entropy will come up directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers goes no further than a couple of Wikipedia articles and YouTube videos, but I assume quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on, mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and also expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30768</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30768"/>
		<updated>2017-11-30T02:57:09Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Computational Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can be lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems and 8 quanta of energy: each quantum can be assigned to either system. All 8 quanta could go to one system, or they could be split evenly. Because every distinct arrangement of the quanta is equally likely, a probability distribution can be formed over the possible divisions of energy. In this model, configurations near equilibrium (energy shared evenly) have the highest probability, while the scenarios where all the quanta sit in just one of the two systems have the lowest. In this way entropy gains a new definition: &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (TED-Ed).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the energy, the number of energy arrangements, and the entropy of a solid:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
[[embed]]&lt;br /&gt;
&amp;lt;iframe src=&amp;quot;https://trinket.io/embed/glowscript/c24ad7936e&amp;quot; width=&amp;quot;100%&amp;quot; height=&amp;quot;356&amp;quot; frameborder=&amp;quot;0&amp;quot; marginwidth=&amp;quot;0&amp;quot; marginheight=&amp;quot;0&amp;quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;]]&lt;br /&gt;
[[/embed]]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are there in the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics term. Entropy is a fundamental part of what makes the universe tick, and it is so powerful that it may (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may yet solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is at its heart statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though it is unlikely that entropy will come up directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers goes no further than a couple of Wikipedia articles and YouTube videos, but I assume quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into part of statistical thermodynamics, with many formulas and ways of calculating it.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30763</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30763"/>
		<updated>2017-11-30T02:55:22Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Computational Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: the quanta can be assigned in any combination, from all 8 in one system to an even split between the two. If every microscopic arrangement of the quanta is equally likely, then a probability distribution can be formed over the possible configurations. In this model, the configurations near equilibrium (an even split of the energy) are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot; (Ted Talk)&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
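The two formulas above (the arrangement count and the entropy) can be sketched in a few lines of Python. This is a minimal sketch; the function names are illustrative, and the counting formula is the standard Einstein-solid result that q quanta can be arranged among n one-dimensional oscillators in (q + n - 1)-choose-q ways.&lt;br /&gt;

```python
from math import comb, log

def omega(q, n):
    """Number of ways to arrange q quanta among n one-dimensional oscillators."""
    return comb(q + n - 1, q)

def entropy(q, n, k_B=1.38e-23):
    """Entropy S = k_B * ln(omega), in J/K."""
    return k_B * log(omega(q, n))

print(omega(3, 4))    # 20 ways to arrange 3 quanta among 4 oscillators
print(entropy(3, 4))  # about 4.13e-23 J/K
```

Note that omega grows extremely fast with q and n, which is why working with the logarithm in the entropy formula is so convenient.&lt;br /&gt;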
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;iframe src=&amp;quot;https://trinket.io/embed/glowscript/c24ad7936e&amp;quot; width=&amp;quot;100%&amp;quot; height=&amp;quot;356&amp;quot; frameborder=&amp;quot;0&amp;quot; marginwidth=&amp;quot;0&amp;quot; marginheight=&amp;quot;0&amp;quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of ways to arrange the quanta) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
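The middling and difficult calculations can be reproduced with a short Python sketch. One assumption to flag: in the Einstein-solid model each atom contributes 3 independent one-dimensional oscillators, so the 7-atom nanoparticle is treated here as n = 21 oscillators.&lt;br /&gt;

```python
from math import comb, log

q = 4                       # quanta of energy in the nanoparticle
n = 7 * 3                   # 7 copper atoms, 3 one-dimensional oscillators each
omega = comb(q + n - 1, q)  # number of microstates (ways to arrange the quanta)

k_B = 1.38e-23              # Boltzmann constant, J/K
S = k_B * log(omega)        # entropy S = k_B * ln(omega)

print(omega)  # 10626
print(S)      # about 1.28e-22 J/K
```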
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental principle that makes the universe tick, and it is such a powerful force that it may cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe may eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and gloomy though it may be, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is rooted in statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though the odds are low that entropy will come up directly in the day-to-day life of an industrial engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into part of statistical thermodynamics, with many formulas and ways of calculating it.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30761</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30761"/>
		<updated>2017-11-30T02:55:02Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Computational Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: the quanta can be assigned in any combination, from all 8 in one system to an even split between the two. If every microscopic arrangement of the quanta is equally likely, then a probability distribution can be formed over the possible configurations. In this model, the configurations near equilibrium (an even split of the energy) are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot; (Ted Talk)&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
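The two formulas above (the arrangement count and the entropy) can be sketched in a few lines of Python. This is a minimal sketch; the function names are illustrative, and the counting formula is the standard Einstein-solid result that q quanta can be arranged among n one-dimensional oscillators in (q + n - 1)-choose-q ways.&lt;br /&gt;

```python
from math import comb, log

def omega(q, n):
    """Number of ways to arrange q quanta among n one-dimensional oscillators."""
    return comb(q + n - 1, q)

def entropy(q, n, k_B=1.38e-23):
    """Entropy S = k_B * ln(omega), in J/K."""
    return k_B * log(omega(q, n))

print(omega(3, 4))    # 20 ways to arrange 3 quanta among 4 oscillators
print(entropy(3, 4))  # about 4.13e-23 J/K
```

Note that omega grows extremely fast with q and n, which is why working with the logarithm in the entropy formula is so convenient.&lt;br /&gt;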
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;iframe src=&amp;quot;https://trinket.io/embed/glowscript/c24ad7936e&amp;quot; width=&amp;quot;100%&amp;quot; height=&amp;quot;356&amp;quot; frameborder=&amp;quot;0&amp;quot; marginwidth=&amp;quot;0&amp;quot; marginheight=&amp;quot;0&amp;quot; allowfullscreen&amp;gt;&amp;lt;/iframe&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of ways to arrange the quanta) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
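The middling and difficult calculations can be reproduced with a short Python sketch. One assumption to flag: in the Einstein-solid model each atom contributes 3 independent one-dimensional oscillators, so the 7-atom nanoparticle is treated here as n = 21 oscillators.&lt;br /&gt;

```python
from math import comb, log

q = 4                       # quanta of energy in the nanoparticle
n = 7 * 3                   # 7 copper atoms, 3 one-dimensional oscillators each
omega = comb(q + n - 1, q)  # number of microstates (ways to arrange the quanta)

k_B = 1.38e-23              # Boltzmann constant, J/K
S = k_B * log(omega)        # entropy S = k_B * ln(omega)

print(omega)  # 10626
print(S)      # about 1.28e-22 J/K
```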
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental principle that makes the universe tick, and it is such a powerful force that it may cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe may eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and gloomy though it may be, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is rooted in statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though the odds are low that entropy will come up directly in the day-to-day life of an industrial engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into part of statistical thermodynamics, with many formulas and ways of calculating it.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30745</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30745"/>
		<updated>2017-11-30T02:45:26Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* The Main Idea */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: the quanta can be assigned in any combination, from all 8 in one system to an even split between the two. If every microscopic arrangement of the quanta is equally likely, then a probability distribution can be formed over the possible configurations. In this model, the configurations near equilibrium (an even split of the energy) are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot; (Ted Talk)&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
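The two formulas above (the arrangement count and the entropy) can be sketched in a few lines of Python. This is a minimal sketch; the function names are illustrative, and the counting formula is the standard Einstein-solid result that q quanta can be arranged among n one-dimensional oscillators in (q + n - 1)-choose-q ways.&lt;br /&gt;

```python
from math import comb, log

def omega(q, n):
    """Number of ways to arrange q quanta among n one-dimensional oscillators."""
    return comb(q + n - 1, q)

def entropy(q, n, k_B=1.38e-23):
    """Entropy S = k_B * ln(omega), in J/K."""
    return k_B * log(omega(q, n))

print(omega(3, 4))    # 20 ways to arrange 3 quanta among 4 oscillators
print(entropy(3, 4))  # about 4.13e-23 J/K
```

Note that omega grows extremely fast with q and n, which is why working with the logarithm in the entropy formula is so convenient.&lt;br /&gt;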
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of ways to arrange the quanta) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
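The middling and difficult calculations can be reproduced with a short Python sketch. One assumption to flag: in the Einstein-solid model each atom contributes 3 independent one-dimensional oscillators, so the 7-atom nanoparticle is treated here as n = 21 oscillators.&lt;br /&gt;

```python
from math import comb, log

q = 4                       # quanta of energy in the nanoparticle
n = 7 * 3                   # 7 copper atoms, 3 one-dimensional oscillators each
omega = comb(q + n - 1, q)  # number of microstates (ways to arrange the quanta)

k_B = 1.38e-23              # Boltzmann constant, J/K
S = k_B * log(omega)        # entropy S = k_B * ln(omega)

print(omega)  # 10626
print(S)      # about 1.28e-22 J/K
```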
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental principle that makes the universe tick, and it is such a powerful force that it may cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe may eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and gloomy though it may be, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is rooted in statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though the odds are low that entropy will come up directly in the day-to-day life of an industrial engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into part of statistical thermodynamics, with many formulas and ways of calculating it.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30740</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30740"/>
		<updated>2017-11-30T02:43:36Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: the quanta can be assigned in any combination, from all 8 in one system to an even split between the two. If every microscopic arrangement of the quanta is equally likely, then a probability distribution can be formed over the possible configurations. In this model, the configurations near equilibrium (an even split of the energy) are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
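The two formulas above (the arrangement count and the entropy) can be sketched in a few lines of Python. This is a minimal sketch; the function names are illustrative, and the counting formula is the standard Einstein-solid result that q quanta can be arranged among n one-dimensional oscillators in (q + n - 1)-choose-q ways.&lt;br /&gt;

```python
from math import comb, log

def omega(q, n):
    """Number of ways to arrange q quanta among n one-dimensional oscillators."""
    return comb(q + n - 1, q)

def entropy(q, n, k_B=1.38e-23):
    """Entropy S = k_B * ln(omega), in J/K."""
    return k_B * log(omega(q, n))

print(omega(3, 4))    # 20 ways to arrange 3 quanta among 4 oscillators
print(entropy(3, 4))  # about 4.13e-23 J/K
```

Note that omega grows extremely fast with q and n, which is why working with the logarithm in the entropy formula is so convenient.&lt;br /&gt;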
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of ways to arrange the quanta) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
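The middling and difficult calculations can be reproduced with a short Python sketch. One assumption to flag: in the Einstein-solid model each atom contributes 3 independent one-dimensional oscillators, so the 7-atom nanoparticle is treated here as n = 21 oscillators.&lt;br /&gt;

```python
from math import comb, log

q = 4                       # quanta of energy in the nanoparticle
n = 7 * 3                   # 7 copper atoms, 3 one-dimensional oscillators each
omega = comb(q + n - 1, q)  # number of microstates (ways to arrange the quanta)

k_B = 1.38e-23              # Boltzmann constant, J/K
S = k_B * log(omega)        # entropy S = k_B * ln(omega)

print(omega)  # 10626
print(S)      # about 1.28e-22 J/K
```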
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental principle that makes the universe tick, and it is such a powerful force that it may cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe may eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and gloomy though it may be, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is rooted in statistical thermodynamics. This is similar to Industrial Engineering, which is essentially a statistics-driven business major. Though the odds are low that entropy will come up directly in the day-to-day life of an industrial engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into part of statistical thermodynamics, with many formulas and ways of calculating it.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
*Chabay, Ruth W., and Bruce A. Sherwood. Matter &amp;amp; Interactions. John Wiley &amp;amp; Sons, 2015.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30737</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30737"/>
		<updated>2017-11-30T02:41:44Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Simple */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: the quanta can be assigned in any combination, from all 8 in one system to an even split between the two. If every microscopic arrangement of the quanta is equally likely, then a probability distribution can be formed over the possible configurations. In this model, the configurations near equilibrium (an even split of the energy) are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
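The two formulas above (the arrangement count and the entropy) can be sketched in a few lines of Python. This is a minimal sketch; the function names are illustrative, and the counting formula is the standard Einstein-solid result that q quanta can be arranged among n one-dimensional oscillators in (q + n - 1)-choose-q ways.&lt;br /&gt;

```python
from math import comb, log

def omega(q, n):
    """Number of ways to arrange q quanta among n one-dimensional oscillators."""
    return comb(q + n - 1, q)

def entropy(q, n, k_B=1.38e-23):
    """Entropy S = k_B * ln(omega), in J/K."""
    return k_B * log(omega(q, n))

print(omega(3, 4))    # 20 ways to arrange 3 quanta among 4 oscillators
print(entropy(3, 4))  # about 4.13e-23 J/K
```

Note that omega grows extremely fast with q and n, which is why working with the logarithm in the entropy formula is so convenient.&lt;br /&gt;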
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? One approach is a VPython simulation: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
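As a concrete sketch of the idea from the introduction (two systems sharing 8 quanta, with the even split most probable), here is a plain-Python version of such a computational model; the block sizes of 3 oscillators each are an illustrative assumption, not taken from the linked GlowScript program:&lt;br /&gt;

```python
from math import comb

def omega(q, n):
    # microstates for q quanta among n one-dimensional oscillators
    return comb(q + n - 1, q)

def macrostate_probs(q_total, n1, n2):
    """Probability of finding q1 quanta in block 1 when two Einstein
    solids share q_total quanta and every microstate is equally likely."""
    counts = [omega(q1, n1) * omega(q_total - q1, n2)
              for q1 in range(q_total + 1)]
    total = sum(counts)
    return [c / total for c in counts]

# 8 quanta shared by two blocks of 3 oscillators each
probs = macrostate_probs(8, 3, 3)
```

Listing the probabilities shows the distribution peaking at the even split q1 = 4 and bottoming out at q1 = 0 and q1 = 8, exactly the picture described in The Main Idea.&lt;br /&gt;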
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules are there in one quantum of energy for the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
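A numeric cross-check of the three examples above (assuming the usual Einstein-solid setup: effective stiffness 4 times the interatomic stiffness, 3 oscillators per atom, and standard values for hbar, k_B, and the molar mass of copper; these assumptions are mine, not stated in the worked images):&lt;br /&gt;

```python
from math import comb, log, sqrt

K_B  = 1.38e-23    # Boltzmann constant, J/K
HBAR = 1.0546e-34  # reduced Planck constant, J*s
N_A  = 6.022e23    # Avogadro's number, 1/mol

k_s    = 28.0            # interatomic spring stiffness of copper, N/m
m_atom = 63.55e-3 / N_A  # mass of one copper atom, kg (molar mass / N_A)

# Simple: one quantum of energy, E = hbar * sqrt(4 * k_s / m)
# (the factor of 4 is the common Einstein-solid correction for bonds)
E_quantum = HBAR * sqrt(4 * k_s / m_atom)

# Middling: omega for q = 4 quanta among n = 3 * 7 = 21 oscillators
q, n = 4, 3 * 7
omega = comb(q + n - 1, q)

# Difficult: entropy of that macrostate, S = k_B * ln(omega)
S = K_B * log(omega)
```

Under these assumptions the quantum comes out near 3.4 x 10&lt;sup&gt;-21&lt;/sup&gt; J, omega = 10626, and S is on the order of 10&lt;sup&gt;-22&lt;/sup&gt; J/K.&lt;br /&gt;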
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps Multivac, the supercomputer from Asimov&#039;s &amp;quot;The Last Question,&amp;quot; will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though entropy is unlikely to appear directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, matters for making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann pictured an ideal gas in a container and stated that entropy is the constant he found (now the Boltzmann constant) times the logarithm of the number of microstates the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30736</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30736"/>
		<updated>2017-11-30T02:41:37Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Difficult */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems sharing 8 quanta of energy: you can assign each quantum to either system, so all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a probability distribution forms over the possible splits. In this model, the configuration in which the energy is shared evenly, at equilibrium, has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way a newer definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to let you calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the number of ways to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 x 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? One approach is a VPython simulation: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules are there in one quantum of energy for the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps Multivac, the supercomputer from Asimov&#039;s &amp;quot;The Last Question,&amp;quot; will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though entropy is unlikely to appear directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, matters for making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann pictured an ideal gas in a container and stated that entropy is the constant he found (now the Boltzmann constant) times the logarithm of the number of microstates the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30735</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30735"/>
		<updated>2017-11-30T02:41:22Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Simple */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems sharing 8 quanta of energy: you can assign each quantum to either system, so all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a probability distribution forms over the possible splits. In this model, the configuration in which the energy is shared evenly, at equilibrium, has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way a newer definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to let you calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the number of ways to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 x 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? One approach is a VPython simulation: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules are there in one quantum of energy for the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps Multivac, the supercomputer from Asimov&#039;s &amp;quot;The Last Question,&amp;quot; will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though entropy is unlikely to appear directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, matters for making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann pictured an ideal gas in a container and stated that entropy is the constant he found (now the Boltzmann constant) times the logarithm of the number of microstates the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=File:Writtenfile2.png&amp;diff=30734</id>
		<title>File:Writtenfile2.png</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=File:Writtenfile2.png&amp;diff=30734"/>
		<updated>2017-11-30T02:40:59Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30733</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30733"/>
		<updated>2017-11-30T02:40:31Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Simple */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems sharing 8 quanta of energy: you can assign each quantum to either system, so all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a probability distribution forms over the possible splits. In this model, the configuration in which the energy is shared evenly, at equilibrium, has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way a newer definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to let you calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the number of ways to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 x 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? One approach is a VPython simulation: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules are there in one quantum of energy for the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps Multivac, the supercomputer from Asimov&#039;s &amp;quot;The Last Question,&amp;quot; will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though entropy is unlikely to appear directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, matters for making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann pictured an ideal gas in a container and stated that entropy is the constant he found (now the Boltzmann constant) times the logarithm of the number of microstates the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30731</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30731"/>
		<updated>2017-11-30T02:39:18Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Simple */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems sharing 8 quanta of energy: you can assign each quantum to either system, so all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a probability distribution forms over the possible splits. In this model, the configuration in which the energy is shared evenly, at equilibrium, has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way a newer definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to let you calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the number of ways to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 x 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? One approach is a VPython simulation: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules are there in one quantum of energy for the cluster?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated value of omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps Multivac, the supercomputer from Asimov&#039;s &amp;quot;The Last Question,&amp;quot; will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though entropy is unlikely to appear directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, matters for making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann pictured an ideal gas in a container and stated that entropy is the constant he found (now the Boltzmann constant) times the logarithm of the number of microstates the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30729</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30729"/>
		<updated>2017-11-30T02:39:07Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Difficult */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial idea in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam-Webster). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space with two systems sharing 8 quanta of energy: you can assign each quantum to either system, so all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a probability distribution forms over the possible splits. In this model, the configuration in which the energy is shared evenly, at equilibrium, has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way a newer definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula to calculate Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 × 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider the embedded VPython example here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
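The counting and entropy formulas above can be sketched in plain Python (a stand-in for the GlowScript/VPython version; the convention of 3 oscillators per atom and the example values of 7 atoms and 4 quanta are illustrative assumptions, not part of the original):

```python
# Microstates and entropy of an Einstein solid, following the formulas above:
# omega = (q + N - 1)! / (q! * (N - 1)!) ways to place q quanta in N oscillators,
# then S = k_B * ln(omega).
from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_microstates(q: int, n_oscillators: int) -> int:
    """Ways to arrange q energy quanta among n one-dimensional oscillators."""
    return comb(q + n_oscillators - 1, q)

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(omega), in J/K."""
    return K_B * log(omega)

# Illustrative values: a 7-atom nanoparticle has 3 * 7 = 21 one-dimensional
# oscillators (3 per atom); give it 4 quanta of energy.
omega = num_microstates(4, 21)  # 10626
print(omega, entropy(omega))
```

Plotting omega against how the quanta split between two such solids reproduces the sharply peaked distribution described in the Main Idea.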
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of microstates) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile1.png]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, as the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistical business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume anything in the field of quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contrasting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who essentially modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container; he then defined entropy as the constant he found times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into a concept of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand, and that also help expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30727</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30727"/>
		<updated>2017-11-30T02:38:25Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Simple */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space with two systems and 8 quanta, you can assign each quantum to whichever system you like: all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a whole probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is near equilibrium have the highest probability, while the scenarios in which all the quanta are located in just one of the two systems have the lowest. In this way a new definition of entropy emerges: &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula to calculate Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 × 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider the embedded VPython example here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
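The counting and entropy formulas above can be sketched in plain Python (a stand-in for the GlowScript/VPython version; the convention of 3 oscillators per atom and the example values of 7 atoms and 4 quanta are illustrative assumptions, not part of the original):

```python
# Microstates and entropy of an Einstein solid, following the formulas above:
# omega = (q + N - 1)! / (q! * (N - 1)!) ways to place q quanta in N oscillators,
# then S = k_B * ln(omega).
from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_microstates(q: int, n_oscillators: int) -> int:
    """Ways to arrange q energy quanta among n one-dimensional oscillators."""
    return comb(q + n_oscillators - 1, q)

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(omega), in J/K."""
    return K_B * log(omega)

# Illustrative values: a 7-atom nanoparticle has 3 * 7 = 21 one-dimensional
# oscillators (3 per atom); give it 4 quanta of energy.
omega = num_microstates(4, 21)  # 10626
print(omega, entropy(omega))
```

Plotting omega against how the quanta split between two such solids reproduces the sharply peaked distribution described in the Main Idea.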
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile2.png]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of microstates) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:asdfasdf.jpg]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, as the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistical business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume anything in the field of quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contrasting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who essentially modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container; he then defined entropy as the constant he found times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into a concept of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand, and that also help expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30726</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30726"/>
		<updated>2017-11-30T02:38:13Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Middling */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space with two systems and 8 quanta, you can assign each quantum to whichever system you like: all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a whole probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is near equilibrium have the highest probability, while the scenarios in which all the quanta are located in just one of the two systems have the lowest. In this way a new definition of entropy emerges: &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula to calculate Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 × 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider the embedded VPython example here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
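The counting and entropy formulas above can be sketched in plain Python (a stand-in for the GlowScript/VPython version; the convention of 3 oscillators per atom and the example values of 7 atoms and 4 quanta are illustrative assumptions, not part of the original):

```python
# Microstates and entropy of an Einstein solid, following the formulas above:
# omega = (q + N - 1)! / (q! * (N - 1)!) ways to place q quanta in N oscillators,
# then S = k_B * ln(omega).
from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_microstates(q: int, n_oscillators: int) -> int:
    """Ways to arrange q energy quanta among n one-dimensional oscillators."""
    return comb(q + n_oscillators - 1, q)

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(omega), in J/K."""
    return K_B * log(omega)

# Illustrative values: a 7-atom nanoparticle has 3 * 7 = 21 one-dimensional
# oscillators (3 per atom); give it 4 quanta of energy.
omega = num_microstates(4, 21)  # 10626
print(omega, entropy(omega))
```

Plotting omega against how the quanta split between two such solids reproduces the sharply peaked distribution described in the Main Idea.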
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:asdfsdaf.jpg]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of microstates) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:writtenfile3.png]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:asdfasdf.jpg]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, as the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistical business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume anything in the field of quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contrasting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who essentially modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container; he then defined entropy as the constant he found times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into a concept of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand, and that also help expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=File:Writtenfile3.png&amp;diff=30723</id>
		<title>File:Writtenfile3.png</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=File:Writtenfile3.png&amp;diff=30723"/>
		<updated>2017-11-30T02:37:52Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=File:Writtenfile1.png&amp;diff=30721</id>
		<title>File:Writtenfile1.png</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=File:Writtenfile1.png&amp;diff=30721"/>
		<updated>2017-11-30T02:36:27Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30719</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30719"/>
		<updated>2017-11-30T02:35:49Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Examples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space with two systems and 8 quanta, you can assign each quantum to whichever system you like: all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a whole probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is near equilibrium have the highest probability, while the scenarios in which all the quanta are located in just one of the two systems have the lowest. In this way a new definition of entropy emerges: &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula to calculate Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 × 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider the embedded VPython example here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
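The counting and entropy formulas above can be sketched in plain Python (a stand-in for the GlowScript/VPython version; the convention of 3 oscillators per atom and the example values of 7 atoms and 4 quanta are illustrative assumptions, not part of the original):

```python
# Microstates and entropy of an Einstein solid, following the formulas above:
# omega = (q + N - 1)! / (q! * (N - 1)!) ways to place q quanta in N oscillators,
# then S = k_B * ln(omega).
from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_microstates(q: int, n_oscillators: int) -> int:
    """Ways to arrange q energy quanta among n one-dimensional oscillators."""
    return comb(q + n_oscillators - 1, q)

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(omega), in J/K."""
    return K_B * log(omega)

# Illustrative values: a 7-atom nanoparticle has 3 * 7 = 21 one-dimensional
# oscillators (3 per atom); give it 4 quanta of energy.
omega = num_microstates(4, 21)  # 10626
print(omega, entropy(omega))
```

Plotting omega against how the quanta split between two such solids reproduces the sharply peaked distribution described in the Main Idea.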
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
[[File:asdfsdaf.jpg]]&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of microstates) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
[[File:asdfasdf.jpg]]&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
[[File:asdfasdf.jpg]]&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, as the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistical business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume anything in the field of quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contrasting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who essentially modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container; he then defined entropy as the constant he found times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into a concept of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand, and that also help expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30710</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30710"/>
		<updated>2017-11-30T02:29:04Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Examples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space with two systems and 8 quanta, you can assign each quantum to whichever system you like: all 8 quanta could go to one system, or they could be evenly distributed. If every arrangement of quanta is equally likely, a whole probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is near equilibrium have the highest probability, while the scenarios in which all the quanta are located in just one of the two systems have the lowest. In this way a new definition of entropy emerges: &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula to calculate Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k&lt;sub&gt;B&lt;/sub&gt; (the Boltzmann constant) = 1.38 × 10&lt;sup&gt;-23&lt;/sup&gt; J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider the embedded VPython example here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
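The counting and entropy formulas above can be sketched in plain Python (a stand-in for the GlowScript/VPython version; the convention of 3 oscillators per atom and the example values of 7 atoms and 4 quanta are illustrative assumptions, not part of the original):

```python
# Microstates and entropy of an Einstein solid, following the formulas above:
# omega = (q + N - 1)! / (q! * (N - 1)!) ways to place q quanta in N oscillators,
# then S = k_B * ln(omega).
from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_microstates(q: int, n_oscillators: int) -> int:
    """Ways to arrange q energy quanta among n one-dimensional oscillators."""
    return comb(q + n_oscillators - 1, q)

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(omega), in J/K."""
    return K_B * log(omega)

# Illustrative values: a 7-atom nanoparticle has 3 * 7 = 21 one-dimensional
# oscillators (3 per atom); give it 4 quanta of energy.
omega = num_microstates(4, 21)  # 10626
print(omega, entropy(omega))
```

Plotting omega against how the quanta split between two such solids reproduces the sharply peaked distribution described in the Main Idea.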
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how many joules of energy are in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of microstates) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Calculate the entropy of the system using the previously calculated omega.&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, a supercomputer like Asimov&#039;s Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, as the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistical business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume anything in the field of quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contrasting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who essentially modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container; he then defined entropy as the constant he found times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into a concept of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand, and that also help expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30708</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30708"/>
		<updated>2017-11-30T02:28:22Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Examples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial concept in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space consisting of two systems and 8 quanta, each quantum can be assigned to either system: all 8 quanta could go to one system, or they could be distributed evenly. Since every arrangement of quanta is equally likely, a probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is split evenly between the two systems (equilibrium) are the most probable, while the scenarios in which all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
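As a minimal sketch of what such a computational model could look like (plain Python rather than VPython, and assuming the Einstein-solid formulas above, with the 7-atom copper nanoparticle treated as 7 × 3 = 21 one-dimensional oscillators):

```python
import math

def num_arrangements(q, n):
    # Number of ways to arrange q quanta among n one-dimensional
    # oscillators: omega = (q + n - 1)! / (q! * (n - 1)!)
    return math.comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(omega), with k_B = 1.38e-23 J/K
    k_B = 1.38e-23
    return k_B * math.log(num_arrangements(q, n))

# Hypothetical example: a nanoparticle of 7 copper atoms has
# 7 * 3 = 21 one-dimensional oscillators; give it 8 quanta.
omega = num_arrangements(8, 21)
S = entropy(8, 21)
print(omega)  # number of microstates
print(S)      # entropy in J/K
```

Repeating this for a range of q values gives the probability distribution over configurations described in the Main Idea section.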
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given that the interatomic spring stiffness of copper is 28 N/m, answer the following:&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, how much energy, in joules, is in the cluster of nanoparticles?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
What is the value of omega (the number of possible arrangements) for 4 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscenely long expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps, as in Asimov&#039;s story, the supercomputer Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistics-driven business major. Though entropy is unlikely to be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer, etc.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probabilities. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found multiplied by the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy evolved from a qualitative thermodynamic idea into a statistical thermodynamics with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30694</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30694"/>
		<updated>2017-11-30T02:18:10Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Examples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial concept in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space consisting of two systems and 8 quanta, each quantum can be assigned to either system: all 8 quanta could go to one system, or they could be distributed evenly. Since every arrangement of quanta is equally likely, a probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is split evenly between the two systems (equilibrium) are the most probable, while the scenarios in which all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the Young&#039;s modulus for copper (look it up):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From here, compare it to the already-calculated specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
Think critically: why are these values the same?&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscenely long expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps, as in Asimov&#039;s story, the supercomputer Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistics-driven business major. Though entropy is unlikely to be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer, etc.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probabilities. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found multiplied by the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy evolved from a qualitative thermodynamic idea into a statistical thermodynamics with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30664</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30664"/>
		<updated>2017-11-30T01:46:13Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial concept in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space consisting of two systems and 8 quanta, each quantum can be assigned to either system: all 8 quanta could go to one system, or they could be distributed evenly. Since every arrangement of quanta is equally likely, a probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is split evenly between the two systems (equilibrium) are the most probable, while the scenarios in which all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:entropy123.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the Young&#039;s modulus for copper (look it up):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From here, compare it to the already-calculated specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscenely long expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps, as in Asimov&#039;s story, the supercomputer Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistics-driven business major. Though entropy is unlikely to be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer, etc.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probabilities. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found multiplied by the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy evolved from a qualitative thermodynamic idea into a statistical thermodynamics with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=File:Entropy123.png&amp;diff=30663</id>
		<title>File:Entropy123.png</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=File:Entropy123.png&amp;diff=30663"/>
		<updated>2017-11-30T01:45:57Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30659</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30659"/>
		<updated>2017-11-30T01:44:40Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial concept in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space consisting of two systems and 8 quanta, each quantum can be assigned to either system: all 8 quanta could go to one system, or they could be distributed evenly. Since every arrangement of quanta is equally likely, a probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is split evenly between the two systems (equilibrium) are the most probable, while the scenarios in which all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the Young&#039;s modulus for copper (look it up):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From here, compare it to the already-calculated specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscenely long expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps, as in Asimov&#039;s story, the supercomputer Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistics-driven business major. Though entropy is unlikely to be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer, etc.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probabilities. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found multiplied by the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy evolved from a qualitative thermodynamic idea into a statistical thermodynamics with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and help expound on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30658</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30658"/>
		<updated>2017-11-30T01:44:08Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is a crucial concept in both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space consisting of two systems and 8 quanta, each quantum can be assigned to either system: all 8 quanta could go to one system, or they could be distributed evenly. Since every arrangement of quanta is equally likely, a probability distribution can be formed over the possible configurations. In this model, the configurations in which the energy is split evenly between the two systems (equilibrium) are the most probable, while the scenarios in which all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.55.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Exaasdfasdfle.jpg]]&lt;br /&gt;
&lt;br /&gt;
Where k_B (the Boltzmann constant) = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Given the Young&#039;s modulus for copper (look it up):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From here, compare it to the already-calculated specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy always increases, over an obscenely long expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps, as in Asimov&#039;s story, the supercomputer Multivac may finally solve the Last Question and reboot the entire universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering, which is essentially a statistics-driven business major. Though entropy is unlikely to be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in making chips that can withstand intense heat transfer, etc.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probabilities. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found multiplied by the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy evolved from a qualitative thermodynamic idea into a statistical thermodynamics with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=File:Screen_Shot_2017-11-27_at_4.55.56_PM.png&amp;diff=30656</id>
		<title>File:Screen Shot 2017-11-27 at 4.55.56 PM.png</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=File:Screen_Shot_2017-11-27_at_4.55.56_PM.png&amp;diff=30656"/>
		<updated>2017-11-30T01:43:46Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30652</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30652"/>
		<updated>2017-11-30T01:43:06Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: the quanta can be configured in any arrangement, from all 8 in one system to an even split. Since each individual arrangement is equally likely, a probability distribution can be formed over the possible splits. In this model, the splits near equilibrium are the most probable, while the scenarios in which all of the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;a direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of a system:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Easdfsadfmple.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:Exaasdfasdfle.jpg]]&lt;br /&gt;
&lt;br /&gt;
Where the Boltzmann constant k_B = 1.38 × 10^-23 J/K&lt;br /&gt;
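Since the formula images above are placeholders, here is a short Python sketch of the two standard Einstein-solid formulas just described: the number of ways Omega to arrange q quanta among n one-dimensional oscillators, Omega = (q + n - 1)! / (q! (n - 1)!), and the entropy S = k_B ln(Omega).

```python
from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_ways(q, n):
    # Omega = (q + n - 1)! / (q! * (n - 1)!), i.e. C(q + n - 1, q)
    return comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(Omega)
    return K_B * log(num_ways(q, n))

print(num_ways(4, 3))   # 15 ways to arrange 4 quanta among 3 oscillators
print(entropy(4, 3))    # in J/K
```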
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
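As one possible starting point (plain Python rather than VPython, with arbitrarily chosen system sizes), the two-system picture from the Main Idea can be tabulated directly: for each way of splitting 8 quanta between two small Einstein solids, count the microstates and observe that the even split dominates.

```python
from math import comb

N1, N2, Q = 3, 3, 8   # two systems of 3 oscillators each sharing 8 quanta

def ways(q, n):
    # microstates of one Einstein solid: C(q + n - 1, q)
    return comb(q + n - 1, q)

for q1 in range(Q + 1):
    total = ways(q1, N1) * ways(Q - q1, N2)
    print(q1, Q - q1, total)   # the 4/4 split has the most microstates
```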
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
These examples use the Einstein model of a solid; take the Young&#039;s modulus of copper as given (it sets the interatomic bond stiffness, and with it the energy of one quantum):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From there, compare this value to the known specific heat of lead. &lt;br /&gt;
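A sketch of how the Simple and Middling answers could be computed, using 1/T ≈ ΔS/ΔE. The energy of one quantum below is a placeholder value; in the actual problem it would be derived from copper&#039;s Young&#039;s modulus and atomic mass.

```python
from math import comb, log

K_B = 1.38e-23      # J/K
E_Q = 1.1e-20       # J per quantum -- placeholder; derive from copper data
N_OSC = 3 * 7       # 7 atoms, 3 oscillators per atom

def S(q):
    # entropy of the nanoparticle with q quanta
    return K_B * log(comb(q + N_OSC - 1, q))

# Simple: temperature near q = 8, from 1/T = dS/dE over a 2-quantum window
T8 = 2 * E_Q / (S(9) - S(7))

# Middling: specific heat per atom, C = dE/dT, from temperatures at q = 7 and 9
T7 = 2 * E_Q / (S(8) - S(6))
T9 = 2 * E_Q / (S(10) - S(8))
C_per_atom = (2 * E_Q / (T9 - T7)) / 7

print(T8, "K")
print(C_per_atom, "J/K per atom")
```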
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics concept. Entropy is a fundamental law that makes the universe tick, and it is so powerful a force that it may cause the eventual end of the entire universe. Since entropy always increases, over an immense span of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps a supercomputer like Asimov&#039;s Multivac will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a qualitative term of thermodynamics into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30649</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30649"/>
		<updated>2017-11-30T01:42:49Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: the quanta can be configured in any arrangement, from all 8 in one system to an even split. Since each individual arrangement is equally likely, a probability distribution can be formed over the possible splits. In this model, the splits near equilibrium are the most probable, while the scenarios in which all of the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;a direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of a system:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
Where the Boltzmann constant k_B = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
These examples use the Einstein model of a solid; take the Young&#039;s modulus of copper as given (it sets the interatomic bond stiffness, and with it the energy of one quantum):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From there, compare this value to the known specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics concept. Entropy is a fundamental law that makes the universe tick, and it is so powerful a force that it may cause the eventual end of the entire universe. Since entropy always increases, over an immense span of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps a supercomputer like Asimov&#039;s Multivac will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a qualitative term of thermodynamics into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30647</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30647"/>
		<updated>2017-11-30T01:41:54Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: the quanta can be configured in any arrangement, from all 8 in one system to an even split. Since each individual arrangement is equally likely, a probability distribution can be formed over the possible splits. In this model, the splits near equilibrium are the most probable, while the scenarios in which all of the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;a direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of a system:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:screenshot1.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
Where the Boltzmann constant k_B = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
These examples use the Einstein model of a solid; take the Young&#039;s modulus of copper as given (it sets the interatomic bond stiffness, and with it the energy of one quantum):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From there, compare this value to the known specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics concept. Entropy is a fundamental law that makes the universe tick, and it is so powerful a force that it may cause the eventual end of the entire universe. Since entropy always increases, over an immense span of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps a supercomputer like Asimov&#039;s Multivac will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a qualitative term of thermodynamics into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=File:Screen_Shot_2017-11-27_at_4.59.56_PM.png&amp;diff=30644</id>
		<title>File:Screen Shot 2017-11-27 at 4.59.56 PM.png</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=File:Screen_Shot_2017-11-27_at_4.59.56_PM.png&amp;diff=30644"/>
		<updated>2017-11-30T01:40:56Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30643</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30643"/>
		<updated>2017-11-30T01:40:35Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: the quanta can be configured in any arrangement, from all 8 in one system to an even split. Since each individual arrangement is equally likely, a probability distribution can be formed over the possible splits. In this model, the splits near equilibrium are the most probable, while the scenarios in which all of the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;a direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of a system:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Screen Shot 2017-11-27 at 4.59.56 PM.png]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
Where the Boltzmann constant k_B = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
These examples use the Einstein model of a solid; take the Young&#039;s modulus of copper as given (it sets the interatomic bond stiffness, and with it the energy of one quantum):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From there, compare this value to the known specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful names given to a physics concept. Entropy is a fundamental law that makes the universe tick, and it is so powerful a force that it may cause the eventual end of the entire universe. Since entropy always increases, over an immense span of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, perhaps a supercomputer like Asimov&#039;s Multivac will finally answer the Last Question and reboot the universe. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is grounded in statistical thermodynamics, and Industrial Engineering is essentially a statistics-driven business major. Though it is unlikely that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned how much usable heat is lost during a reaction, contrasting with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as a constant (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a qualitative term of thermodynamics into a quantity of statistical thermodynamics, with many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30375</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30375"/>
		<updated>2017-11-29T22:31:22Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Examples */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: the quanta can be configured in any arrangement, from all 8 in one system to an even split. Since each individual arrangement is equally likely, a probability distribution can be formed over the possible splits. In this model, the splits near equilibrium are the most probable, while the scenarios in which all of the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;a direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of a system:&lt;br /&gt;
&lt;br /&gt;
Here is the formula for the Einstein model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate the entropy S:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
Where the Boltzmann constant k_B = 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions with this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
These examples use the Einstein model of a solid; take the Young&#039;s modulus of copper as given (it sets the interatomic bond stiffness, and with it the energy of one quantum):&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
For a nanoparticle consisting of 7 copper atoms, what is the temperature when there are 8 quanta of energy in the nanoparticle?&lt;br /&gt;
&lt;br /&gt;
===Middling===&lt;br /&gt;
At the temperature calculated above, what is the approximate specific heat (per atom)?&lt;br /&gt;
From there, compare this value to the known specific heat of lead. &lt;br /&gt;
&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick and it is such a powerful force that it will (possibly) cause the eventual end of the entire universe. Since entropy is always increases, over the expanse of an obscene amount of time the universe due to entropy will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloom, an Asimov supercomputer Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major as an Industrial Engineer as the whole idea of entropy is statistical thermodynamics. This is very similar to Industrial Engineering as it is essentially a statistical business major. Though the odds are unlikely that entropy will be directly used in the day of the life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over regardless of whether the example is of thermodynamic or business. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contesting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30367</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=30367"/>
		<updated>2017-11-29T22:24:10Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: you can assign the quanta in any arrangement you like. All 8 quanta could go to one system, or they could be evenly distributed. Since every individual arrangement is equally probable, a whole distribution can be formed over the possible splits. In this model, the near-equilibrium configurations, where the energy is split roughly evenly, are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
The goal of these formulas is to calculate the heat and temperature of certain objects.&lt;br /&gt;
&lt;br /&gt;
Here is the formula for Einstein&#039;s model of a solid:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B (the Boltzmann constant) = 1.38e-23 J/K&lt;br /&gt;
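The two formulas above can be checked numerically. A minimal Python sketch (the split of 8 quanta among 6 oscillators is an arbitrary illustration, not a value from the article):

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_arrangements(q, n):
    # ways to arrange q quanta among n one-dimensional oscillators:
    # (q + n - 1)! / (q! * (n - 1)!), i.e. a binomial coefficient
    return math.comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(number of microstates)
    return K_B * math.log(num_arrangements(q, n))

# Example: 8 quanta shared among 6 oscillators (2 atoms, 3 oscillators each)
print(num_arrangements(8, 6))  # 1287 microstates
print(entropy(8, 6))           # about 9.9e-23 J/K
```

Tabulating num_arrangements for every split of the quanta between two systems reproduces the probability distribution described in the Main Idea.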
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it may cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, an Asimov supercomputer like Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is at heart statistical thermodynamics, and Industrial Engineering is essentially a statistical business major. Though the odds are slim that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contesting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29743</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29743"/>
		<updated>2017-11-27T22:38:08Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Connectedness */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: you can assign the quanta in any arrangement you like. All 8 quanta could go to one system, or they could be evenly distributed. Since every individual arrangement is equally probable, a whole distribution can be formed over the possible splits. In this model, the near-equilibrium configurations, where the energy is split roughly evenly, are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B (the Boltzmann constant) = 1.38e-23 J/K&lt;br /&gt;
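The two formulas above can be checked numerically. A minimal Python sketch (the split of 8 quanta among 6 oscillators is an arbitrary illustration, not a value from the article):

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_arrangements(q, n):
    # ways to arrange q quanta among n one-dimensional oscillators:
    # (q + n - 1)! / (q! * (n - 1)!), i.e. a binomial coefficient
    return math.comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(number of microstates)
    return K_B * math.log(num_arrangements(q, n))

# Example: 8 quanta shared among 6 oscillators (2 atoms, 3 oscillators each)
print(num_arrangements(8, 6))  # 1287 microstates
print(entropy(8, 6))           # about 9.9e-23 J/K
```

Tabulating num_arrangements for every split of the quanta between two systems reproduces the probability distribution described in the Main Idea.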
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it may cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, an Asimov supercomputer like Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
The study of entropy is pertinent to my major, Industrial Engineering, because entropy is at heart statistical thermodynamics, and Industrial Engineering is essentially a statistical business major. Though the odds are slim that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contesting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29741</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29741"/>
		<updated>2017-11-27T22:37:20Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Connectedness */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: you can assign the quanta in any arrangement you like. All 8 quanta could go to one system, or they could be evenly distributed. Since every individual arrangement is equally probable, a whole distribution can be formed over the possible splits. In this model, the near-equilibrium configurations, where the energy is split roughly evenly, are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B (the Boltzmann constant) = 1.38e-23 J/K&lt;br /&gt;
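The two formulas above can be checked numerically. A minimal Python sketch (the split of 8 quanta among 6 oscillators is an arbitrary illustration, not a value from the article):

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_arrangements(q, n):
    # ways to arrange q quanta among n one-dimensional oscillators:
    # (q + n - 1)! / (q! * (n - 1)!), i.e. a binomial coefficient
    return math.comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(number of microstates)
    return K_B * math.log(num_arrangements(q, n))

# Example: 8 quanta shared among 6 oscillators (2 atoms, 3 oscillators each)
print(num_arrangements(8, 6))  # 1287 microstates
print(entropy(8, 6))           # about 9.9e-23 J/K
```

Tabulating num_arrangements for every split of the quanta between two systems reproduces the probability distribution described in the Main Idea.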
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
&lt;br /&gt;
In my research I read that entropy is known as time&#039;s arrow, which in my opinion is one of the most powerful denotations of a physics term. Entropy is a fundamental law that makes the universe tick, and it is so powerful that it may cause the eventual end of the entire universe. Since entropy always increases, over an obscene expanse of time the universe will eventually suffer a &amp;quot;heat death&amp;quot; and cease to exist entirely. This is merely a scientific hypothesis, and though it may be gloomy, an Asimov supercomputer like Multivac may finally solve the Last Question and reboot the entire universe again. &lt;br /&gt;
&lt;br /&gt;
Entropy is pertinent to my major, Industrial Engineering, because entropy is at heart statistical thermodynamics, and Industrial Engineering is essentially a statistical business major. Though the odds are slim that entropy will be used directly in the day-to-day life of an Industrial Engineer, the same distributions and concepts of probability are universal and carry over whether the example is thermodynamic or business-related. &lt;br /&gt;
&lt;br /&gt;
My understanding of quantum computers comes from no more than a couple of Wikipedia articles and YouTube videos, but I assume that quantum mechanics, which definitely relates to entropy, is important in designing chips that can withstand intense heat transfer.&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contesting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29734</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29734"/>
		<updated>2017-11-27T22:25:03Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* History */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: you can assign the quanta in any arrangement you like. All 8 quanta could go to one system, or they could be evenly distributed. Since every individual arrangement is equally probable, a whole distribution can be formed over the possible splits. In this model, the near-equilibrium configurations, where the energy is split roughly evenly, are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B (the Boltzmann constant) = 1.38e-23 J/K&lt;br /&gt;
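The two formulas above can be checked numerically. A minimal Python sketch (the split of 8 quanta among 6 oscillators is an arbitrary illustration, not a value from the article):

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_arrangements(q, n):
    # ways to arrange q quanta among n one-dimensional oscillators:
    # (q + n - 1)! / (q! * (n - 1)!), i.e. a binomial coefficient
    return math.comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(number of microstates)
    return K_B * math.log(num_arrangements(q, n))

# Example: 8 quanta shared among 6 oscillators (2 atoms, 3 oscillators each)
print(num_arrangements(8, 6))  # 1287 microstates
print(entropy(8, 6))           # about 9.9e-23 J/K
```

Tabulating num_arrangements for every split of the quanta between two systems reproduces the probability distribution described in the Main Idea.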
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contesting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29733</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29733"/>
		<updated>2017-11-27T22:24:42Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* See also */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: you can assign the quanta in any arrangement you like. All 8 quanta could go to one system, or they could be evenly distributed. Since every individual arrangement is equally probable, a whole distribution can be formed over the possible splits. In this model, the near-equilibrium configurations, where the energy is split roughly evenly, are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B (the Boltzmann constant) = 1.38e-23 J/K&lt;br /&gt;
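The two formulas above can be checked numerically. A minimal Python sketch (the split of 8 quanta among 6 oscillators is an arbitrary illustration, not a value from the article):

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_arrangements(q, n):
    # ways to arrange q quanta among n one-dimensional oscillators:
    # (q + n - 1)! / (q! * (n - 1)!), i.e. a binomial coefficient
    return math.comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(number of microstates)
    return K_B * math.log(num_arrangements(q, n))

# Example: 8 quanta shared among 6 oscillators (2 atoms, 3 oscillators each)
print(num_arrangements(8, 6))  # 1287 microstates
print(entropy(8, 6))           # about 9.9e-23 J/K
```

Tabulating num_arrangements for every split of the quanta between two systems reproduces the probability distribution described in the Main Idea.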
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contesting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who modeled entropy as a system of probability. Boltzmann gave a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found (now the Boltzmann constant) times the logarithm of the number of micro-states the gas particles could inhabit. &lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from a purely thermodynamic idea into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Here is a list of great resources about entropy that make it easier to understand and that expound more on the details of the topic.&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29732</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29732"/>
		<updated>2017-11-27T22:24:01Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* External links */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space of two systems sharing 8 quanta of energy: you can assign the quanta in any arrangement you like. All 8 quanta could go to one system, or they could be evenly distributed. Since every individual arrangement is equally probable, a whole distribution can be formed over the possible splits. In this model, the near-equilibrium configurations, where the energy is split roughly evenly, are the most probable, while the scenarios where all the quanta sit in just one of the two systems are the least probable. In this way the new definition of entropy becomes &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B (the Boltzmann constant) = 1.38e-23 J/K&lt;br /&gt;
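The two formulas above can be checked numerically. A minimal Python sketch (the split of 8 quanta among 6 oscillators is an arbitrary illustration, not a value from the article):

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J/K

def num_arrangements(q, n):
    # ways to arrange q quanta among n one-dimensional oscillators:
    # (q + n - 1)! / (q! * (n - 1)!), i.e. a binomial coefficient
    return math.comb(q + n - 1, q)

def entropy(q, n):
    # S = k_B * ln(number of microstates)
    return K_B * math.log(num_arrangements(q, n))

# Example: 8 quanta shared among 6 oscillators (2 atoms, 3 oscillators each)
print(num_arrangements(8, 6))  # 1287 microstates
print(entropy(8, 6))           # about 9.9e-23 J/K
```

Tabulating num_arrangements for every split of the quanta between two systems reproduces the probability distribution described in the Main Idea.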
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic? Consider embedding some VPython code here [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. In his work he questioned the amount of usable heat lost during a reaction, contesting the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of Entropy was then expanded on by mainly Ludwig Boltzmann who essentially modeling entropy as a system of probability. Boltzmann gave a larger scale visualization method of an ideal gas in a container; he then stated that the logarithm of each of the micro-states each gas particle could inhabit times the constant he found was the definition of Entropy. &lt;br /&gt;
&lt;br /&gt;
In this way Entropy came from an idea expounding terms of thermodynamics to a statistical thermodynamics which has many formulas and ways of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29731</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29731"/>
		<updated>2017-11-27T22:23:51Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Further reading */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B, the Boltzmann constant, equals 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. His work examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the logarithm of the number of microstates the gas particles could occupy, multiplied by the constant that now bears his name.&lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29730</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29730"/>
		<updated>2017-11-27T22:23:35Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* Further reading */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B, the Boltzmann constant, equals 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. His work examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the logarithm of the number of microstates the gas particles could occupy, multiplied by the constant that now bears his name.&lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Great TED-ED on the subject:&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29729</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29729"/>
		<updated>2017-11-27T22:23:09Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B, the Boltzmann constant, equals 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. His work examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the logarithm of the number of microstates the gas particles could occupy, multiplied by the constant that now bears his name.&lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Books, Articles or other print media on this topic&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
*http://gallica.bnf.fr/ark:/12148/bpt6k152107/f369.table&lt;br /&gt;
*http://www.panspermia.org/seconlaw.htm&lt;br /&gt;
*https://ed.ted.com/lessons/what-is-entropy-jeff-phillips&lt;br /&gt;
*https://www.merriam-webster.com/dictionary/entropy&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29725</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29725"/>
		<updated>2017-11-27T22:21:31Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* History */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B, the Boltzmann constant, equals 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
The first person to give entropy a name was Rudolf Clausius. His work examined the amount of usable heat lost during a reaction, in contrast to the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius chose the name entropy because the Greek en + tropē means &amp;quot;transformation content.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
The concept of entropy was then expanded mainly by Ludwig Boltzmann, who modeled entropy as a matter of probability. Boltzmann visualized an ideal gas in a container and defined entropy as the logarithm of the number of microstates the gas particles could occupy, multiplied by the constant that now bears his name.&lt;br /&gt;
&lt;br /&gt;
In this way entropy grew from an idea expressed in the terms of classical thermodynamics into statistical thermodynamics, with its many formulas and methods of calculation.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Books, Articles or other print media on this topic&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
This section contains the references you used while writing this page&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29723</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29723"/>
		<updated>2017-11-27T22:02:23Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
From this you can directly calculate Entropy (S):&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
where k_B, the Boltzmann constant, equals 1.38 × 10^-23 J/K&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
Put this idea in historical context. Give the reader the Who, What, When, Where, and Why.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Books, Articles or other print media on this topic&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
This section contains the references you used while writing this page&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29720</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29720"/>
		<updated>2017-11-27T21:58:12Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* A Mathematical Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
Here is a formula to calculate how many ways there are to arrange q quanta among n one-dimensional oscillators:&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
Put this idea in historical context. Give the reader the Who, What, When, Where, and Why.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Books, Articles or other print media on this topic&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
This section contains the references you used while writing this page&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29718</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29718"/>
		<updated>2017-11-27T21:53:48Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* The Main Idea */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
What are the mathematical equations that allow us to model this topic? For example, &amp;lt;math&amp;gt;{\frac{d\vec{p}}{dt}}_{system} = \vec{F}_{net}&amp;lt;/math&amp;gt;, where &#039;&#039;&#039;p&#039;&#039;&#039; is the momentum of the system and &#039;&#039;&#039;F&#039;&#039;&#039; is the net force from the surroundings.&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
Put this idea in historical context. Give the reader the Who, What, When, Where, and Why.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Books, Articles or other print media on this topic&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
This section contains the references you used while writing this page&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29717</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29717"/>
		<updated>2017-11-27T21:53:36Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* The Main Idea */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). That definition, however, can be lost on some people. A better way to visualize entropy is as a probability distribution over energy. Consider a sample space in which two systems share 8 quanta of energy: all 8 quanta could go to one system, or they could be distributed evenly, or anything in between. If every arrangement of quanta is equally likely, a probability distribution forms over these configurations. In this model, the even split corresponding to equilibrium has the highest probability, while the scenarios in which all the quanta sit in just one of the two systems have the lowest. In this way entropy can be redefined as &amp;quot;the direct measure of each energy configuration&#039;s probability&amp;quot; (Ted Talk).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
What are the mathematical equations that allow us to model this topic? For example, &amp;lt;math&amp;gt;{\frac{d\vec{p}}{dt}}_{system} = \vec{F}_{net}&amp;lt;/math&amp;gt;, where &#039;&#039;&#039;p&#039;&#039;&#039; is the momentum of the system and &#039;&#039;&#039;F&#039;&#039;&#039; is the net force from the surroundings.&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or make predictions using this topic? Consider embedding some VPython code here: [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
Put this idea in historical context. Give the reader the Who, What, When, Where, and Why.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Books, Articles or other print media on this topic&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
This section contains the references you used while writing this page&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
	<entry>
		<id>http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29716</id>
		<title>Entropy</title>
		<link rel="alternate" type="text/html" href="http://www.physicsbook.gatech.edu/index.php?title=Entropy&amp;diff=29716"/>
		<updated>2017-11-27T21:52:35Z</updated>

		<summary type="html">&lt;p&gt;Ctyler9: /* The Main Idea */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Short Description of Topic&lt;br /&gt;
&lt;br /&gt;
==The Main Idea==&lt;br /&gt;
&lt;br /&gt;
Entropy is an important idea, crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is &amp;quot;the degree of disorder or randomness in the system&amp;quot; (Merriam). This definition, however, can get lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. In a sample space containing two systems and 8 quanta, each quantum can be assigned to either system: all 8 quanta could go to one system, or they could be distributed evenly. If every arrangement of the quanta is equally likely, a probability distribution can be formed over the possible splits. In this model, the configuration where the energy is at equilibrium has the highest probability, while the scenarios where all the quanta are located in just one of the two systems have the lowest probability. In this way, a new definition of entropy emerges: &amp;quot;the direct measure of each energy configuration&#039;s probability.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
===A Mathematical Model===&lt;br /&gt;
&lt;br /&gt;
What are the mathematical equations that allow us to model this topic.  For example &amp;lt;math&amp;gt;{\frac{d\vec{p}}{dt}}_{system} = \vec{F}_{net}&amp;lt;/math&amp;gt; where &#039;&#039;&#039;p&#039;&#039;&#039; is the momentum of the system and &#039;&#039;&#039;F&#039;&#039;&#039; is the net force from the surroundings.&lt;br /&gt;
&lt;br /&gt;
===A Computational Model===&lt;br /&gt;
&lt;br /&gt;
How do we visualize or predict using this topic. Consider embedding some vpython code here [https://trinket.io/glowscript/31d0f9ad9e Teach hands-on with GlowScript]&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
&lt;br /&gt;
Be sure to show all steps in your solution and include diagrams whenever possible&lt;br /&gt;
&lt;br /&gt;
===Simple===&lt;br /&gt;
===Middling===&lt;br /&gt;
===Difficult===&lt;br /&gt;
&lt;br /&gt;
==Connectedness==&lt;br /&gt;
#How is this topic connected to something that you are interested in?&lt;br /&gt;
#How is it connected to your major?&lt;br /&gt;
#Is there an interesting industrial application?&lt;br /&gt;
&lt;br /&gt;
==History==&lt;br /&gt;
&lt;br /&gt;
Put this idea in historical context. Give the reader the Who, What, When, Where, and Why.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
Are there related topics or categories in this wiki resource for the curious reader to explore?  How does this topic fit into that context?&lt;br /&gt;
&lt;br /&gt;
===Further reading===&lt;br /&gt;
&lt;br /&gt;
Books, Articles or other print media on this topic&lt;br /&gt;
&lt;br /&gt;
===External links===&lt;br /&gt;
&lt;br /&gt;
Internet resources on this topic&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
&lt;br /&gt;
This section contains the references you used while writing this page&lt;br /&gt;
&lt;br /&gt;
[[Category:Which Category did you place this in?]]&lt;/div&gt;</summary>
		<author><name>Ctyler9</name></author>
	</entry>
</feed>