{{short description|Non-technical introduction to entropy}}
{{Introductory article|Entropy}}
{{Multiple issues|
The idea of "[[irreversibility]]" is central to the understanding of '''[[entropy]]'''. Everyone has an intuitive understanding of irreversibility ([[Dissipation|a dissipative process]]) - if one watches a movie of everyday life running forward and in reverse, it is easy to distinguish between the two. The movie running in reverse shows impossible things happening - water jumping out of a glass into a pitcher above it, smoke going down a chimney, water in a glass freezing to form [[ice cube]]s, crashed cars reassembling themselves, and so on. The intuitive meaning of expressions such as "you can't unscramble an egg", "don't cry over spilled milk" or "you can't take the cream out of the coffee" is that these are irreversible processes. There is a direction in time by which spilled milk does not go back into the glass.
{{technical|date=July 2018}}
{{Disputed|date=November 2020}}
}}
{{Thermodynamics|cTopic=[[List of thermodynamic properties|System properties]]}}
In [[thermodynamics]], entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or of a gradual decline into disorder.<ref name="lexico">{{cite web |title=Definition of entropy in English |url=https://www.lexico.com/en/definition/entropy |archive-url=https://web.archive.org/web/20190711005908/https://www.lexico.com/en/definition/entropy |url-status=dead |archive-date=July 11, 2019 |website=Lexico Powered By Oxford |access-date=18 November 2020}}</ref> A more physical interpretation of thermodynamic entropy refers to spread of energy or matter, or to extent and diversity of microscopic motion.


If a movie that shows coffee being mixed or wood being burned were played in reverse, it would depict processes highly improbable in reality. Mixing coffee and burning wood are "irreversible". Irreversibility is described by a law of nature known as the [[second law of thermodynamics]], which states that in an isolated system (a system not connected to any other system) which is undergoing change, entropy increases over time.<ref>Theoretically, coffee can be "unmixed" and wood can be "unburned", but this would need a "machine" that would generate more entropy than was lost in the original process. This is why the second law only holds for isolated systems, which means they cannot be connected to some external "machine".</ref>


Entropy does not increase indefinitely. A body of matter and radiation eventually will reach an unchanging state, with no detectable flows, and is then said to be in a state of [[thermodynamic equilibrium]]. Thermodynamic entropy has a definite value for such a body and is at its maximum value. When bodies of matter or radiation, initially in their own states of internal thermodynamic equilibrium, are brought together so as to intimately interact and reach a new joint equilibrium, then their total entropy increases. For example, a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted leaving a glass of cool water. Such processes are irreversible: A glass of cool water will not [[Spontaneous process|spontaneously]] turn into a glass of warm water with an ice cube in it. Some processes in nature are almost reversible. For example, the orbiting of the planets around the Sun may be thought of as practically reversible: A movie of the planets orbiting the Sun which is run in reverse would not appear to be impossible.


While the second law, and thermodynamics in general, accurately predicts the intimate interactions of complex physical systems, scientists are not content with simply knowing how a system behaves; they also want to know ''why'' it behaves the way it does. The question of why entropy increases until equilibrium is reached was answered in 1877 by the physicist [[Ludwig Boltzmann]]. The theory developed by Boltzmann and others, known as [[statistical mechanics]], explains thermodynamics in terms of the statistical behavior of the atoms and molecules which make up the system. The theory not only explains thermodynamics, but also a host of other phenomena which are outside the scope of thermodynamics.


== Explanation ==
The term ''entropy'' was coined in 1865 by the German physicist [[Rudolf Clausius]], from the Greek words ''en-'', "in", and ''trope'' "a turning", in analogy with ''[[energy]]''.<ref>{{Cite web|title=etymonline.com:entropy|url=http://www.etymonline.com/index.php?search=entropy&searchmode=none|accessdate=2009-06-15}}</ref>


=== Thermodynamic entropy ===


The concept of [[Entropy (classical thermodynamics)|thermodynamic entropy]] arises from the [[second law of thermodynamics]]. This law of entropy increase quantifies the reduction in the capacity of an isolated compound thermodynamic system to do [[Work (thermodynamics)|thermodynamic work]] on its surroundings, or indicates whether a thermodynamic process may occur. For example, whenever there is a suitable pathway, heat spontaneously flows from a hotter body to a colder one.


Thermodynamic entropy is measured as a change in entropy (<math>\Delta S</math>) to a system containing a sub-system which undergoes heat transfer to its surroundings (inside the system of interest). It is based on the [[macroscopic]] relationship between [[heat flow]] into the sub-system and the temperature at which it occurs summed over the boundary of that sub-system.

Following the [[Clausius theorem|formalism of Clausius]], the basic calculation can be mathematically stated as:<ref>I. Klotz, R. Rosenberg, ''Chemical Thermodynamics – Basic Concepts and Methods'', 7th ed., Wiley (2008), p. 125</ref>
: <math>{\rm \delta}S = \frac{{\rm \delta}q}{T}.</math>

where <math>\delta S</math> is the increase or decrease in entropy, <math>\delta q</math> is the heat added to the system or subtracted from it, and <math>T</math> is temperature. The 'equals' sign and the symbol <math>\delta</math> imply that the heat transfer should be so small and slow that it scarcely changes the temperature <math>T</math>.

If the temperature is allowed to vary, the equation must be [[integral|integrated]] over the temperature path. This calculation of entropy change does not allow the determination of an absolute value, only differences. In this context, the second law of thermodynamics may be stated: for heat transferred in any valid process for any system, whether isolated or not,
: <math>{{\rm \delta}S} \ge {\frac{{\rm \delta}q}{T}}.</math>
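As a rough numerical illustration of these relations, the short Python sketch below (with arbitrary illustrative values for the heat and the two temperatures) shows that when heat passes irreversibly from a hotter body to a colder one, the entropy lost by the hotter body is smaller than the entropy gained by the colder one, so the total entropy increases:

<syntaxhighlight lang="python">
# Illustrative sketch of dS = q/T for a small, slow ("reversible") heat transfer,
# and of the entropy increase when heat flows irreversibly from hot to cold.
# The numbers are arbitrary examples, not data from any source.

def entropy_change(q, T):
    """Entropy change in J/K for heat q (joules) transferred reversibly at temperature T (kelvin)."""
    return q / T

q = 100.0                        # joules of heat transferred
T_hot, T_cold = 350.0, 300.0     # kelvin

dS_hot = entropy_change(-q, T_hot)    # the hot body loses the heat
dS_cold = entropy_change(+q, T_cold)  # the cold body gains the heat

print(dS_hot, dS_cold, dS_hot + dS_cold)
# about -0.286, +0.333 and a net of +0.048 J/K: the total entropy rises,
# consistent with the inequality dS >= q/T for the overall process.
</syntaxhighlight>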


According to the [[first law of thermodynamics]], which deals with the [[conservation of energy]], the loss <math>\delta q</math> of heat will result in a decrease in the [[internal energy]] of the [[thermodynamic system]]. Thermodynamic entropy provides a comparative measure of the amount of decrease in internal energy and the corresponding increase in internal energy of the surroundings at a given temperature. In many cases, a visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. When applicable, entropy increase is the quantitative measure of that kind of a spontaneous process: how much energy has been effectively lost or become unavailable, by dispersing itself, or spreading itself out, as assessed at a specific temperature. For this assessment, when the temperature is higher, the amount of energy dispersed is assessed as 'costing' proportionately less. This is because a hotter body is generally more able to do thermodynamic work, other factors, such as internal energy, being equal. This is why a steam engine has a hot firebox.


The second law of thermodynamics deals only with changes of entropy (<math>\Delta S</math>). The absolute entropy (S) of a system may be determined using the [[third law of thermodynamics]], which specifies that the entropy of all perfectly crystalline substances is zero at the [[absolute zero]] of temperature.<ref>{{cite book |last1=Atkins |first1=Peter |last2=de Paula |first2=Julio |title=Atkins' Physical Chemistry |date=2006 |publisher=W. H. Freeman |isbn=0-7167-8759-8 |pages=92–94 |edition=8th}}</ref> The entropy at another temperature is then equal to the increase in entropy on heating the system reversibly from absolute zero to the temperature of interest.<ref>{{cite book |last1=Laidler |first1=Keith J. |last2=Meiser |first2=John H. |title=Physical Chemistry |date=1982 |publisher=Benjamin/Cummings |isbn=0-8053-5682-7 |page=110 |quote=Entropies can then be determined at other temperatures, by considering a series of reversible processes by which the temperature is raised from the absolute zero to the temperature in question.}}</ref>
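The third-law procedure can be sketched numerically: starting from zero entropy at absolute zero, the small contributions <math>C_p\,\mathrm{d}T/T</math> are added up while the substance is heated reversibly. The Python sketch below assumes, purely for illustration, a Debye-like low-temperature heat capacity <math>C_p = aT^3</math> with an invented coefficient; it is not a calculation for any real substance.

<syntaxhighlight lang="python">
# Sketch of an absolute entropy from the third law: S(T) = integral of Cp(T)/T dT
# starting from absolute zero, with an assumed heat capacity Cp = a*T**3
# (a rough Debye-like form for a crystal at low temperature; the coefficient is invented).

a = 1.0e-4        # J/(K^4 mol), illustrative coefficient only

def absolute_entropy(T_final, steps=100_000):
    dT = T_final / steps
    S = 0.0
    for i in range(steps):
        T = (i + 0.5) * dT      # midpoint of each small temperature increment
        Cp = a * T**3           # assumed low-temperature heat capacity
        S += Cp * dT / T        # dS = dq_rev / T = Cp dT / T
    return S

print(absolute_entropy(20.0))   # numerical sum of the small increments
print(a * 20.0**3 / 3)          # analytic check: S = a*T^3/3 when Cp = a*T^3
</syntaxhighlight>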


=== Statistical mechanics and information entropy ===


Thermodynamic entropy bears a close relationship to the concept of [[information entropy]] (''H''). Information entropy is a measure of the "spread" of a probability density or probability mass function. Thermodynamics makes no assumptions about the atomistic nature of matter, but when matter is viewed in this way, as a collection of particles constantly moving and exchanging energy with each other, and which may be described in a probabilistic manner, information theory may be successfully applied to explain the results of thermodynamics. The resulting theory is known as [[statistical mechanics]].


An important concept in statistical mechanics is the idea of the [[macrostate|microstate and the macrostate]] of a system. If we have a container of gas, for example, and we know the position and velocity of every molecule in that system, then we know the microstate of that system. If we only know the thermodynamic description of that system, the pressure, volume, temperature, and/or the entropy, then we know the macrostate of that system. Boltzmann realized that there are many different microstates that can yield the same macrostate, and, because the particles are colliding with each other and changing their velocities and positions, the microstate of the gas is always changing. But if the gas is in equilibrium, there seems to be no change in its macroscopic behavior: No changes in pressure, temperature, etc. Statistical mechanics relates the thermodynamic entropy of a macrostate to the number of microstates that could yield that macrostate. In statistical mechanics, the entropy of the system is given by Ludwig Boltzmann's equation:
: <math>S=k_\text{B}\,\ln W</math>
where ''S'' is the thermodynamic entropy, ''W'' is the number of microstates that may yield the macrostate, and <math>k_\text{B}</math> is the [[Boltzmann constant]]. The [[natural logarithm]] of the number of microstates (<math>\ln W</math>) is known as the [[information entropy]] of the system. This can be illustrated by a simple example:


If you flip two coins, you can have four different results. If ''H'' is heads and ''T'' is tails, we can have (''H'',''H''), (''H'',''T''), (''T'',''H''), and (''T'',''T''). We can call each of these a "microstate", for which we know exactly the results of the process. But what if we have less information? Suppose we only know the total number of heads. This can be either 0, 1, or 2. We can call these "macrostates". Only microstate (''T'',''T'') will give macrostate zero, (''H'',''T'') and (''T'',''H'') will give macrostate 1, and only (''H'',''H'') will give macrostate 2. So we can say that the information entropy of macrostates 0 and 2 is ln(1), which is zero, but the information entropy of macrostate 1 is ln(2), which is about 0.69. Of all the microstates, macrostate 1 accounts for half of them.
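This counting can be reproduced in a few lines of Python; the snippet below simply enumerates the four microstates of two coin flips, groups them into macrostates by the number of heads, and prints <math>\ln W</math> for each macrostate:

<syntaxhighlight lang="python">
# Enumerate the microstates of two coin flips and the information entropy
# ln(W) of each macrostate (W = number of microstates with that many heads).

from itertools import product
from math import log
from collections import Counter

microstates = list(product("HT", repeat=2))            # (H,H), (H,T), (T,H), (T,T)
counts = Counter(state.count("H") for state in microstates)

for heads, W in sorted(counts.items()):
    print(f"macrostate with {heads} heads: W = {W}, ln(W) = {log(W):.2f}")
# macrostate with 0 heads: W = 1, ln(W) = 0.00
# macrostate with 1 heads: W = 2, ln(W) = 0.69
# macrostate with 2 heads: W = 1, ln(W) = 0.00
</syntaxhighlight>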


It turns out that if you flip a large number of coins, the macrostates at or near half heads and half tails account for almost all of the microstates. In other words, for a million coins, you can be fairly sure that about half will be heads and half tails. The macrostates around a 50–50 ratio of heads to tails will be the "equilibrium" macrostate. A real physical system in equilibrium has a huge number of possible microstates and almost all of them correspond to the equilibrium macrostate, and that is the macrostate you will almost certainly see if you wait long enough. In the coin example, if you start out with a very unlikely macrostate (like all heads, for example with zero entropy) and begin flipping one coin at a time, the entropy of the macrostate will start increasing, just as thermodynamic entropy does, and after a while, the coins will most likely be at or near that 50–50 macrostate, which has the greatest information entropy – the equilibrium entropy.
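The concentration of microstates near the 50–50 macrostate can also be checked directly. The sketch below uses 1,000 coins (a number chosen only to keep the computation quick) and counts how many of the microstates lie within 5% of half heads:

<syntaxhighlight lang="python">
# For many coins, macrostates close to half heads account for nearly all
# microstates. 1,000 coins are used here purely as an illustrative size.

from math import comb, log

n = 1000
total = 2 ** n                                          # total number of microstates
near_half = sum(comb(n, k) for k in range(450, 551))    # within 5% of half heads

print(near_half / total)        # about 0.999: the overwhelming majority of microstates
print(log(comb(n, n // 2)))     # ln W of the 50-50 macrostate, the largest value
print(log(comb(n, 0)))          # ln W of the all-tails macrostate: ln(1) = 0
</syntaxhighlight>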


The macrostate of a system is what we know about the system, for example the [[temperature]], [[pressure]], and [[volume (thermodynamics)|volume]] of a gas in a box. For each set of values of temperature, pressure, and volume there are many arrangements of molecules which result in those values. The number of arrangements of molecules which could result in the same values for temperature, pressure and volume is the number of microstates.


The concept of information entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. When it is applied to the problem of a large number of interacting particles, along with some other constraints, like the conservation of energy, and the assumption that all microstates are equally likely, the resultant theory of statistical mechanics is extremely successful in explaining the [[laws of thermodynamics]].


[[Image:Ice water.jpg|thumb|Ice melting provides an example of entropy ''increasing''.]]


== Example of increasing entropy ==
{{Main article|Disgregation}}
Ice melting provides an example in which entropy increases in a small system, a thermodynamic system consisting of the surroundings (the warm room) and the entity of glass container, ice and water which has been allowed to reach [[thermodynamic equilibrium]] at the melting temperature of ice. In this system, some [[heat]] (''δQ'') from the warmer surroundings at 298&nbsp;K (25&nbsp;°C; 77&nbsp;°F) transfers to the cooler system of ice and water at its constant temperature (''T'') of 273&nbsp;K (0&nbsp;°C; 32&nbsp;°F), the melting temperature of ice. The entropy of the system, which is {{sfrac|δ''Q''|''T''}}, increases by {{sfrac|δ''Q''|273 K}}. The heat δ''Q'' for this process is the energy required to change water from the solid state to the liquid state, and is called the [[enthalpy of fusion]], i.e. Δ''H'' for ice fusion.


The entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of 298&nbsp;K is larger than 273&nbsp;K and therefore the ratio (entropy change) of {{sfrac|δ''Q''|298 K}} for the surroundings is smaller than the ratio (entropy change) of {{sfrac|δ''Q''|273 K}} for the ice and water system. This is always true in spontaneous events in a thermodynamic system and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy.

As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the {{sfrac|δ''Q''|''T''}} over the continuous range ("at many increments") from the initially cool to the finally warm water can be found by calculus. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
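These entropy changes can be put into rough numbers. The Python sketch below uses the molar enthalpy of fusion of ice (about 6008&nbsp;J, the figure used later in this article) as the heat δ''Q'' and the two temperatures of the example; the point is only that the surroundings lose less entropy than the system gains:

<syntaxhighlight lang="python">
# Rough numerical version of the melting-ice example. The heat is taken as the
# molar enthalpy of fusion of ice (about 6008 J); temperatures as in the text.

dQ = 6008.0        # J per mole of ice melted
T_ice = 273.15     # K, melting point of ice (the system stays at this temperature)
T_room = 298.15    # K, temperature of the warmer surroundings

dS_system = dQ / T_ice       # entropy gained by the ice-and-water system
dS_room = -dQ / T_room       # entropy lost by the surrounding room

print(dS_system, dS_room, dS_system + dS_room)
# about +22.0, -20.2 and a net of roughly +1.8 J/K per mole: the total entropy
# of system plus surroundings increases, as the second law requires.
</syntaxhighlight>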
== Heat and entropy ==
At a microscopic level, [[kinetic energy]] of molecules is responsible for the [[temperature]] of a substance or a system. "Heat" is the kinetic energy of molecules being transferred: when motional energy is transferred from hotter surroundings to a cooler system, faster-moving molecules in the surroundings collide with the walls of the system, which [[energy transfer|transfers]] some of their energy to the molecules of the system and makes them move faster.


* Molecules in a [[gas]] like [[nitrogen]] at room temperature at any instant are moving at an average speed of roughly 470&nbsp;m/s (about 1,050 miles per hour), repeatedly colliding and therefore exchanging energy so that their individual speeds are always changing. Assuming an [[ideal gas|ideal-gas]] model, average kinetic energy increases [[linear correlation|linearly]] with temperature, so the average speed increases as the square root of temperature (a numerical check follows this list).
** Thus motional molecular energy (‘heat energy’) from hotter surroundings, like faster-moving molecules in a [[flame]] or violently vibrating iron atoms in a hot plate, will melt or boil a substance (the system) at the temperature of its melting or boiling point. That amount of motional energy from the surroundings that is required for melting or boiling is called the phase-change energy, specifically the enthalpy of fusion or of vaporization, respectively. This phase-change energy breaks bonds between the molecules in the system (not chemical bonds inside the molecules that hold the atoms together) rather than contributing to the motional energy and making the molecules move any faster – so it does not raise the temperature, but instead enables the molecules to break free to move as a liquid or as a vapor.
** In terms of energy, when a solid becomes a liquid or a vapor, motional energy coming from the surroundings is changed to ‘potential energy‘ in the substance ([[phase transition|phase change]] energy, which is released back to the surroundings when the surroundings become cooler than the substance's boiling or melting temperature, respectively). Phase-change energy increases the entropy of a substance or system because it is energy that must be spread out in the system from the surroundings so that the substance can exist as a liquid or vapor at a temperature above its melting or boiling point. When this process occurs in a 'universe' that consists of the surroundings plus the system, the total energy of the 'universe' becomes more dispersed or spread out as part of the greater energy that was only in the hotter surroundings transfers so that some is in the cooler system. This energy dispersal increases the entropy of the 'universe'.
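The speed quoted in the first bullet can be checked with the standard kinetic-theory expression for the mean molecular speed, <math>\bar v = \sqrt{8RT/(\pi M)}</math>; the sketch below applies it to nitrogen, and also confirms the square-root dependence on temperature:

<syntaxhighlight lang="python">
# Mean molecular speed of nitrogen from kinetic theory, v = sqrt(8RT/(pi*M)),
# and the square-root scaling of speed with temperature.

from math import sqrt, pi

R = 8.314      # J/(mol K), gas constant
M = 0.028      # kg/mol, approximate molar mass of N2

def mean_speed(T):
    return sqrt(8 * R * T / (pi * M))

v_room = mean_speed(298.0)
print(v_room, v_room * 2.237)           # roughly 475 m/s, about 1,060 miles per hour
print(mean_speed(4 * 298.0) / v_room)   # 2.0: quadrupling T doubles the average speed
</syntaxhighlight>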


The important overall principle is that ''"Energy of all types changes from being localized to becoming dispersed or spread out, if not hindered from doing so. Entropy (or better, entropy change) is the quantitative measure of that kind of a spontaneous process: how much energy has been transferred, divided by T, or how widely it has become spread out at a specific temperature."''

== Origins and uses ==
Originally, entropy was named to describe the "waste heat", or more accurately, energy loss, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work. Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level. In the late 19th century, the word "disorder" was used by [[Ludwig Boltzmann]] in developing [[Entropy (statistical views)|statistical views of entropy]] using [[probability theory]] to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by [[Werner Heisenberg]] and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and [[statistical mechanics]].


For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the [[kinetic energy|"motional" (i.e. kinetic) energy]] of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe [[Entropy (energy dispersal)|entropy as energy dispersal]].<ref name=Lambert>[http://franklambert.net/entropysite.com/ Entropy Sites — A Guide] Content selected by [[Frank L. Lambert]]</ref> Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.


The mathematics developed in statistical thermodynamics were found to be applicable in other disciplines. In particular, information sciences developed the concept of [[information entropy]], which lacks the Boltzmann constant inherent in thermodynamic entropy.


=== Classical calculation of entropy ===
When the word 'entropy' was first defined and used in 1865, the very existence of atoms was still controversial, though it had long been speculated that temperature was due to the motion of microscopic constituents and that "heat" was the transferring of that motion from one place to another. Entropy change, <math>\Delta S</math>, was described in macroscopic terms that could be directly measured, such as volume, temperature, or pressure. However, today the classical equation of entropy, <math>\Delta S = \frac{q_\mathrm{rev}}{T}</math> can be explained, part by part, in modern terms describing how molecules are responsible for what is happening:
* <math>\Delta S</math> is the change in entropy of a system (some physical substance of interest) after some motional energy ("heat") has been transferred to it by fast-moving molecules. So, <math>\Delta S = S_\mathrm{final} - S _\mathrm{initial}</math>.
* Then, <math> \Delta S = S_\mathrm{final} - S _\mathrm{initial} = \frac{q_\mathrm{rev}}{T}</math>, the quotient of the motional energy ("heat") q that is transferred "reversibly" (rev) to the system from the surroundings (or from another system in contact with the first system) divided by T, the absolute temperature at which the transfer occurs.
** "Reversible" or "reversibly" (rev) simply means that T, the temperature of the system, has to stay (almost) exactly the same while any energy is being transferred to or from it. That is easy in the case of phase changes, where the system absolutely must stay in the solid or liquid form until enough energy is given to it to break bonds between the molecules before it can change to a liquid or a gas. For example, in the melting of ice at 273.15 K, no matter what temperature the surroundings are – from 273.20 K to 500 K or even higher, the temperature of the ice will stay at 273.15 K until the last molecules in the ice are changed to liquid water, i.e., until all the hydrogen bonds between the water molecules in ice are broken and new, less-exactly fixed hydrogen bonds between liquid water molecules are formed. This amount of energy necessary for ice melting per mole has been found to be 6008 joules at 273 K. Therefore, the entropy change per mole is <math>\frac{q_\mathrm{rev}}{T} = \frac{6008\,\mathrm J}{273\,\mathrm K}</math>, or 22 J/K.
** When the temperature is not at the melting or boiling point of a substance no intermolecular bond-breaking is possible, and so any motional molecular energy ("heat") from the surroundings transferred to a system raises its temperature, making its molecules move faster and faster. As the temperature is constantly rising, there is no longer a particular value of "T" at which energy is transferred. However, a "reversible" energy transfer can be measured at a very small temperature increase, and a cumulative total can be found by adding each of many small temperature intervals or increments. For example, to find the entropy change <math>\frac{q_\mathrm{rev}}{T}</math> from 300 K to 310 K, measure the amount of energy transferred at dozens or hundreds of temperature increments, say from 300.00 K to 300.01 K and then 300.01 to 300.02 and so on, dividing the q by each T, and finally adding them all.
** Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the energy being transferred "per incremental change in temperature" (the heat capacity, <math>C_p</math>), multiplied by the [[integral]] of <math>\frac{dT}{T}</math> from <math>T_\mathrm{initial}</math> to <math>T_\mathrm{final}</math>, is directly given by <math>\Delta S = C_p \ln\frac{T_\mathrm{final}}{T_\mathrm{initial}}</math>. Both of these calculations are illustrated numerically in the sketch following this list.
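Both calculations in the list above can be checked with a few lines of Python. The heat capacity below (about 75.3&nbsp;J/(mol·K) for liquid water) is treated as constant purely for illustration:

<syntaxhighlight lang="python">
# Two ways of computing an entropy change, as described in the list above:
# (1) a phase change at constant temperature, q_rev/T, and
# (2) heating at varying temperature, either as a sum of many small q/T
#     increments or in closed form as Cp*ln(T_final/T_initial).

from math import log

# (1) melting one mole of ice at its melting point:
print(6008.0 / 273.15)                      # about 22 J/K per mole

# (2) heating one mole of liquid water from 300 K to 310 K:
Cp = 75.3                                   # J/(mol K), assumed constant here
T_i, T_f, steps = 300.0, 310.0, 10_000

dT = (T_f - T_i) / steps
S_sum = sum(Cp * dT / (T_i + (i + 0.5) * dT) for i in range(steps))
print(S_sum)                                # sum over many small increments
print(Cp * log(T_f / T_i))                  # closed form, about 2.47 J/K per mole
</syntaxhighlight>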


== Alternate explanations of entropy ==


=== Thermodynamic entropy ===
* '''A measure of energy unavailable for work''': This is an often-repeated phrase which, although it is true, requires considerable clarification to be understood. It is only true for cyclic reversible processes, and is in this sense misleading. By "work" is meant moving an object, for example, lifting a weight, or bringing a flywheel up to speed, or carrying a load up a hill. To convert heat into work, using a coal-burning steam engine, for example, one must have two systems at different temperatures, and the amount of work you can extract depends on how large the temperature difference is, and how large the systems are. If one of the systems is at room temperature, and the other system is much larger, and near absolute zero temperature, then almost ALL of the energy of the room temperature system can be converted to work. If they are both at the same room temperature, then NONE of the energy of the room temperature system can be converted to work. Entropy is then a measure of how much energy cannot be converted to work, given these conditions. More precisely, for an isolated system comprising two closed systems at different temperatures, in the process of reaching equilibrium the amount of entropy lost by the hot system, multiplied by the temperature of the colder system, is the amount of energy that cannot be converted to work (see the numerical sketch following this list).
* '''An indicator of irreversibility''': fitting closely with the 'unavailability of energy' interpretation is the 'irreversibility' interpretation. Spontaneous thermodynamic processes are irreversible, in the sense that they do not spontaneously undo themselves. Thermodynamic processes artificially imposed by agents in the surroundings of a body also have irreversible effects on the body. For example, when [[James Prescott Joule]] used a device that delivered a measured amount of mechanical work from the surroundings through a paddle that stirred a body of water, the energy transferred was received by the water as heat. There was scarce expansion of the water doing thermodynamic work back on the surroundings. The body of water showed no sign of returning the energy by stirring the paddle in reverse. The work transfer appeared as heat, and was not recoverable without a suitably cold reservoir in the surroundings. Entropy gives a precise account of such irreversibility.
* [[Entropy (energy dispersal)|'''Dispersal''']]: [[Edward A. Guggenheim]] proposed an ordinary language interpretation of entropy that may be rendered as "dispersal of modes of microscopic motion throughout their accessible range".<ref name="Dugdale 101">Dugdale, J.S. (1996). ''Entropy and its Physical Meaning'', Taylor & Francis, London, {{ISBN|0748405682}}, Dugdale cites only Guggenheim, on page 101.</ref><ref name="Guggenheim1949">Guggenheim, E.A. (1949), Statistical basis of thermodynamics, ''Research: A Journal of Science and its Applications'', '''2''', Butterworths, London, pp. 450–454; p. 453, "If instead of entropy one reads number of accessible states, or spread, the physical significance becomes clear."</ref> Later, along with a criticism of the idea of entropy as 'disorder', the dispersal interpretation was advocated by [[Frank L. Lambert]],<ref name=Lambert/><ref name="Lambert2005">{{cite journal |last1=Kozliak |first1=Evguenii I. |last2=Lambert |first2=Frank L.|date=2005 |title="Order-to-Disorder" for Entropy Change? Consider the Numbers!|journal=Chem. Educator |volume=10 |pages= 24–25|url=http://franklambert.net/entropysite.com/order_to_disorder.pdf}}</ref> and is used in some student textbooks.<ref>For example: Atkins, P. W., de Paula J. Atkins' Physical Chemistry, 2006, W.H. Freeman and Company, 8th edition, {{ISBN|9780716787594}}. Brown, T. L., H. E. LeMay, B. E. Bursten, C.J. Murphy, P. Woodward, M.E. Stoltzfus 2017. Chemistry: The Central Science, 10th ed. Prentice Hall, 1248pp, {{ISBN|9780134414232}}. Ebbing, D.D., and S. D. Gammon, 2017. General Chemistry, 11th ed. Centage Learning 1190pp, {{ISBN|9781305580343}}. Petrucci, Herring, Madura, Bissonnette 2011 General Chemistry: Principles and Modern Applications, 10th edition, 1426 pages, Pearson Canada {{ISBN|9780132064521}}.</ref>
: The interpretation properly refers to dispersal in abstract microstate spaces, but it may be loosely visualised in some simple examples of spatial spread of matter or energy. If a partition is removed from between two different gases, the molecules of each gas spontaneously disperse as widely as possible into their respectively newly accessible volumes; this may be thought of as mixing. If a partition, that blocks heat transfer between two bodies of different temperatures, is removed so that heat can pass between the bodies, then energy spontaneously disperses or spreads as heat from the hotter to the colder.
: Beyond such loose visualizations, in a general thermodynamic process, considered microscopically, spontaneous dispersal occurs in abstract microscopic [[phase space]]. According to Newton's and other laws of motion, phase space provides a systematic scheme for the description of the diversity of microscopic motion that occurs in bodies of matter and radiation. The second law of thermodynamics may be regarded as quantitatively accounting for the intimate interactions, dispersal, or mingling of such microscopic motions. In other words, entropy may be regarded as measuring the extent of diversity of motions of microscopic constituents of bodies of matter and radiation in their own states of internal thermodynamic equilibrium.
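The "unavailable energy" reading of the first item can be illustrated with the Carnot limit: of heat ''Q'' drawn from a hot body, at most <math>Q(1 - T_\text{c}/T_\text{h})</math> can become work, and the remainder, the cold temperature times the entropy given up by the hot body, cannot. The temperatures in the sketch below are arbitrary illustrative values:

<syntaxhighlight lang="python">
# Sketch of "energy unavailable for work" for heat drawn from a hot body,
# with a cold body available at temperature Tc (arbitrary example values).

Q = 1000.0              # J of heat drawn from the hot body
Th, Tc = 600.0, 300.0   # K, hot and cold temperatures

dS_hot = Q / Th                     # entropy given up by the hot body
W_max = Q * (1 - Tc / Th)           # the most work any engine could extract
unavailable = Tc * dS_hot           # energy that cannot be converted to work

print(W_max, unavailable, W_max + unavailable)   # 500.0 500.0 1000.0
# If Tc is close to Th, almost nothing is available; if Tc is near absolute
# zero, almost all of Q could be converted to work, as described above.
</syntaxhighlight>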

=== Information entropy and statistical mechanics ===

* [[Entropy (order and disorder)|'''As a measure of disorder''']]: Traditionally, 20th century textbooks have introduced [[Entropy (order and disorder)|entropy as order and disorder]] so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in, and arbitrary interpretations of, the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the [[Entropy (information theory)|Shannon entropy]] of the probability distribution of microstates given a particular macrostate,<ref name="Callen1985">{{cite book|title=Thermodynamics and an Introduction to Thermostatistics|last=Callen|first=Herbert B.|date=1985|publisher=John Wiley & Sons|isbn=0-471-86256-8|edition=2nd|location=New York|author-link=Herbert Callen}}</ref>{{rp|379}} in which case the [[Entropy in thermodynamics and information theory|connection of "disorder" to thermodynamic entropy]] is straightforward, but arbitrary and not immediately obvious to anyone unfamiliar with information theory.
* '''Missing information''': The idea that information entropy is a measure of how much one does not know about a system is quite useful.
: If, instead of using the natural logarithm to define information entropy, we use the base-2 logarithm, then the information entropy is roughly equal to the average number of (carefully chosen <ref name="questions">The minimum number of questions is achieved when each question either gives an answer with certainty, or cuts the remaining uncertainty in half. For example, if we had a probability function <math>P_i = (1/8,1/2,1/8,1/4)</math> associated with a variable <math>x=(x_1,x_2,x_3,x_4)</math>, then the optimum mode of questioning would be to first ask "is ''x'' equal to ''x''<sub>2</sub>?" If the answer is "yes", then ''x'' is certainly equal to ''x''<sub>2</sub> after asking only one question, and the probability of this happening is ''P''<sub>2</sub> = 1/2. If the answer is "no", then the next question would be "Is ''x'' equal to ''x''<sub>4</sub>?" If the answer is yes, then ''x'' is certainly equal to ''x''<sub>4</sub> after asking two questions, and the probability of this happening is ''P''<sub>4</sub> = 1/4. If the answer is "no", we may finally ask "is ''x'' equal to ''x''<sub>1</sub>?" If the answer is yes, then ''x'' is certainly equal to ''x''<sub>1</sub> and if not, then ''x'' is certainly equal to ''x''<sub>3</sub>, and the probability of requiring three questions is ''P''<sub>1</sub> + ''P''<sub>3</sub> = 1/4. The average number of binary questions asked is then ''Q'' = (1/2)(1)+(1/4)(2)+(1/4)(3) = 7/4. Calculating the Shannon information entropy:
: <math>Q=-\sum_{i=1}^4 P_i \log_2(P_i) = 7/4</math> Sh
which is in agreement with the step-by-step procedure. In most cases, it is not clear how to continually divide the remaining options in half with each question, so the concept is strictly applicable only for special cases, and becomes more accurate as the number of possible outcomes increases. Nevertheless, the Shannon expression for ''Q'' is valid even in these cases.</ref>) yes/no questions that would have to be asked to get complete information about the system under study. In the introductory example of two flipped coins, for the macrostate which contains one head and one tail, one would need only one question to determine its exact state (e.g., "is the first one heads?"), and instead of expressing the entropy as ln(2) one could say, equivalently, that it is log<sub>2</sub>(2), which equals the number of binary questions we would need to ask: one. When measuring entropy using the natural logarithm (ln), the unit of information entropy is called a "nat", but when it is measured using the base-2 logarithm, the unit of information entropy is called a "shannon" (alternatively, "bit"). This is just a difference in units, much like the difference between inches and centimeters (1&nbsp;nat = log<sub>2</sub>''e''&nbsp;shannons). Thermodynamic entropy is equal to the Boltzmann constant times the information entropy expressed in nats. The information entropy expressed with the unit [[shannon (unit)|shannon]] (Sh) is equal to the number of yes–no questions that need to be answered in order to determine the microstate from the macrostate. These unit conversions are illustrated in the short numerical sketch at the end of this section.
: The concepts of "disorder" and "spreading" can be analyzed with this information entropy concept in mind. For example, if we take a new deck of cards out of the box, it is arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), we may say that we then have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6&nbsp;shannons: We will need to ask about 225.6 questions, on average, to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered" or that the ordered cards have been "spread" throughout the deck. But information entropy does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards, in order, then the information entropy becomes zero. If we again shuffle the deck, the information entropy would again be about 225.6&nbsp;shannons, even if by some miracle it reshuffled to the same order as when it came out of the box, because even if it did, we would not know that. So the concept of "disorder" is useful if, by order, we mean maximal knowledge and by disorder we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feeling to what happens to the cards when they are shuffled. The probability of a card being in a particular place in an ordered deck is either 0 or 1, in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
: The connection between thermodynamic entropy and information entropy is given by Boltzmann's equation, which says that {{nowrap|1=''S'' = ''k''<sub>B</sub> ln ''W''}}. If we take the base-2 logarithm of ''W'', it will yield the average number of questions we must ask about the microstate of the physical system in order to determine its macrostate.<ref>In classical mechanics the velocities and positions are real numbers, and there is a [[Continuum (measurement)|continuum]] of an infinite number of microstates. This would mean that an infinite number of questions would have to be asked in order to determine a macrostate. In quantum mechanics, the microstates are "quantized" and there are a finite number of them for a given energy, so the number of questions is finite. Boltzmann developed his theory before the advent of quantum mechanics, and it is to his credit that he was nevertheless able to develop a theory that dealt with a theoretically infinite number of microstates.</ref>
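The figures quoted in this section can be checked numerically. The sketch below reproduces the average of 7/4 questions for the distribution (1/8, 1/2, 1/8, 1/4), the roughly 225.6&nbsp;shannons of a shuffled 52-card deck, and the conversion between shannons and nats:

<syntaxhighlight lang="python">
# Checks of the information-entropy figures quoted above, and the unit conversion.

from math import log, log2, lgamma

P = [1/8, 1/2, 1/8, 1/4]
H_sh = -sum(p * log2(p) for p in P)
print(H_sh)                      # 1.75 shannons = 7/4 yes/no questions on average

# A thoroughly shuffled deck has 52! equally likely orderings:
H_deck = lgamma(53) / log(2)     # log2(52!) computed via the log-gamma function
print(H_deck)                    # about 225.58 shannons ("about 225.6" above)

print(H_sh * log(2))             # the same entropy expressed in nats
# Multiplying an entropy in nats by the Boltzmann constant gives the
# corresponding thermodynamic entropy, S = k_B ln W.
</syntaxhighlight>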

== See also ==
* [[Entropy (classical thermodynamics)]]
* [[Entropy (energy dispersal)]]
* [[Second law of thermodynamics]]
* [[Statistical mechanics]]
* [[Thermodynamics]]
* [[List of textbooks in thermodynamics and statistical mechanics|List of textbooks on thermodynamics and statistical mechanics]]

== References ==
{{reflist}}


== Further reading ==
* {{cite book|author=Goldstein, Martin and Inge F.|year=1993|title=The Refrigerator and the Universe: Understanding the Laws of Energy|publisher=Harvard Univ. Press|url=https://books.google.com/books?id=PDnG4dtaixYC|isbn=9780674753259}} Chapters 4–12 touch on entropy.


{{Introductory science articles}}


{{DEFAULTSORT:Introduction To Entropy}}
[[Category:Thermodynamic entropy]]
[[Category:Introduction articles|Entropy]]

Latest revision as of 14:53, 18 September 2024

In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or of a gradual decline into disorder.[1] A more physical interpretation of thermodynamic entropy refers to spread of energy or matter, or to extent and diversity of microscopic motion.

If a movie that shows coffee being mixed or wood being burned is played in reverse, it would depict processes highly improbable in reality. Mixing coffee and burning wood are "irreversible". Irreversibility is described by a law of nature known as the second law of thermodynamics, which states that in an isolated system (a system not connected to any other system) which is undergoing change, entropy increases over time.[2]

Entropy does not increase indefinitely. A body of matter and radiation eventually will reach an unchanging state, with no detectable flows, and is then said to be in a state of thermodynamic equilibrium. Thermodynamic entropy has a definite value for such a body and is at its maximum value. When bodies of matter or radiation, initially in their own states of internal thermodynamic equilibrium, are brought together so as to intimately interact and reach a new joint equilibrium, then their total entropy increases. For example, a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted leaving a glass of cool water. Such processes are irreversible: A glass of cool water will not spontaneously turn into a glass of warm water with an ice cube in it. Some processes in nature are almost reversible. For example, the orbiting of the planets around the Sun may be thought of as practically reversible: A movie of the planets orbiting the Sun which is run in reverse would not appear to be impossible.

While the second law, and thermodynamics in general, accurately predicts the intimate interactions of complex physical systems, scientists are not content with simply knowing how a system behaves, they also want to know why it behaves the way it does. The question of why entropy increases until equilibrium is reached was answered in 1877 by physicist Ludwig Boltzmann. The theory developed by Boltzmann and others, is known as statistical mechanics. Statistical mechanics explains thermodynamics in terms of the statistical behavior of the atoms and molecules which make up the system. The theory not only explains thermodynamics, but also a host of other phenomena which are outside the scope of thermodynamics.

Explanation

Thermodynamic entropy

The concept of thermodynamic entropy arises from the second law of thermodynamics. This law of entropy increase quantifies the reduction in the capacity of an isolated compound thermodynamic system to do thermodynamic work on its surroundings, or indicates whether a thermodynamic process may occur. For example, whenever there is a suitable pathway, heat spontaneously flows from a hotter body to a colder one.

Thermodynamic entropy is measured as a change in entropy (ΔS) to a system containing a sub-system which undergoes heat transfer to its surroundings (inside the system of interest). It is based on the macroscopic relationship between heat flow into the sub-system and the temperature at which it occurs, summed over the boundary of that sub-system.

Following the formalism of Clausius, the basic calculation can be mathematically stated as:[3]

δS = δq/T

where δS is the increase or decrease in entropy, δq is the heat added to the system or subtracted from it, and T is temperature. The 'equals' sign and the symbol δ imply that the heat transfer should be so small and slow that it scarcely changes the temperature T.

If the temperature is allowed to vary, the equation must be integrated over the temperature path. This calculation of entropy change does not allow the determination of absolute value, only differences. In this context, the second law of thermodynamics may be stated as: for heat transferred over any valid process for any system, whether isolated or not,

ΔS ≥ q/Tsurr

where Tsurr is the temperature of the surroundings with which the heat q is exchanged; the equality holds only for reversible processes.
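
To make the Clausius relation and this inequality concrete, here is a minimal numerical sketch in Python; the heat, the temperatures, and the constant-temperature assumption are illustrative choices, not values taken from the article.

    # Entropy bookkeeping for a small heat transfer, following dS = dq_rev / T.
    q = 100.0          # joules of heat absorbed by the system (illustrative)
    T_system = 273.15  # kelvin; system temperature, assumed constant (e.g. melting ice)

    dS_system = q / T_system   # entropy gained by the system in a reversible transfer

    # Clausius inequality: dS >= q / T_surr, with equality only when the surroundings
    # are at (essentially) the same temperature as the system.
    for T_surr in (273.15, 298.15, 350.0):   # reversible case first, then two irreversible ones
        print(f"T_surr = {T_surr:6.2f} K:  q/T_surr = {q / T_surr:.4f} J/K  <=  dS = {dS_system:.4f} J/K")

The bound q/Tsurr equals ΔS only in the first (reversible) case; for hotter surroundings the surroundings lose less entropy (q/Tsurr) than the system gains (q/Tsystem), so the overall entropy increases.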

According to the first law of thermodynamics, which deals with the conservation of energy, the loss of heat will result in a decrease in the internal energy of the thermodynamic system. Thermodynamic entropy provides a comparative measure of the amount of decrease in internal energy and the corresponding increase in internal energy of the surroundings at a given temperature. In many cases, a visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. When applicable, entropy increase is the quantitative measure of that kind of a spontaneous process: how much energy has been effectively lost or become unavailable, by dispersing itself, or spreading itself out, as assessed at a specific temperature. For this assessment, when the temperature is higher, the amount of energy dispersed is assessed as 'costing' proportionately less. This is because a hotter body is generally more able to do thermodynamic work, other factors, such as internal energy, being equal. This is why a steam engine has a hot firebox.

The second law of thermodynamics deals only with changes of entropy (ΔS). The absolute entropy (S) of a system may be determined using the third law of thermodynamics, which specifies that the entropy of all perfectly crystalline substances is zero at the absolute zero of temperature.[4] The entropy at another temperature is then equal to the increase in entropy on heating the system reversibly from absolute zero to the temperature of interest.[5]
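
The third-law procedure can be sketched numerically by accumulating δq_rev/T = Cp dT/T while heating from absolute zero. The cubic heat capacity below is an assumed low-temperature form chosen only to make the example self-contained; it is not taken from the article.

    # Absolute entropy from the third law: S(T) = integral from 0 to T of Cp(T')/T' dT'.
    # Toy model: Cp(T) = a*T**3, so the integrand a*T**2 is well behaved at T = 0
    # and the exact answer is S(T) = a*T**3/3.
    a = 1.0e-4        # J/K^4, arbitrary illustrative coefficient
    T_final = 20.0    # kelvin
    steps = 100_000

    dT = T_final / steps
    S = 0.0
    for i in range(steps):
        T_mid = (i + 0.5) * dT              # midpoint of this small temperature increment
        S += (a * T_mid**3) * dT / T_mid    # dS = dq_rev / T = Cp dT / T

    print(S, a * T_final**3 / 3)   # numerical sum vs. exact result, in J/K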

Statistical mechanics and information entropy

Thermodynamic entropy bears a close relationship to the concept of information entropy (H). Information entropy is a measure of the "spread" of a probability density or probability mass function. Thermodynamics makes no assumptions about the atomistic nature of matter, but when matter is viewed in this way, as a collection of particles constantly moving and exchanging energy with each other, and which may be described in a probabilistic manner, information theory may be successfully applied to explain the results of thermodynamics. The resulting theory is known as statistical mechanics.

An important concept in statistical mechanics is the idea of the microstate and the macrostate of a system. If we have a container of gas, for example, and we know the position and velocity of every molecule in that system, then we know the microstate of that system. If we only know the thermodynamic description of that system, the pressure, volume, temperature, and/or the entropy, then we know the macrostate of that system. Boltzmann realized that there are many different microstates that can yield the same macrostate, and, because the particles are colliding with each other and changing their velocities and positions, the microstate of the gas is always changing. But if the gas is in equilibrium, there seems to be no change in its macroscopic behavior: No changes in pressure, temperature, etc. Statistical mechanics relates the thermodynamic entropy of a macrostate to the number of microstates that could yield that macrostate. In statistical mechanics, the entropy of the system is given by Ludwig Boltzmann's equation:

S = kB ln W

where S is the thermodynamic entropy, W is the number of microstates that may yield the macrostate, and kB is the Boltzmann constant. The natural logarithm of the number of microstates (ln W) is known as the information entropy of the system. This can be illustrated by a simple example:

If you flip two coins, you can have four different results. If H is heads and T is tails, we can have (H,H), (H,T), (T,H), and (T,T). We can call each of these a "microstate" for which we know exactly the results of the process. But what if we have less information? Suppose we only know the total number of heads. This can be either 0, 1, or 2. We can call these "macrostates". Only microstate (T,T) will give macrostate zero, (H,T) and (T,H) will give macrostate 1, and only (H,H) will give macrostate 2. So we can say that the information entropy of macrostates 0 and 2 is ln(1), which is zero, but the information entropy of macrostate 1 is ln(2), which is about 0.69. Of all the microstates, macrostate 1 accounts for half of them.

It turns out that if you flip a large number of coins, the macrostates at or near half heads and half tails account for almost all of the microstates. In other words, for a million coins, you can be fairly sure that about half will be heads and half tails. The macrostates around a 50–50 ratio of heads to tails will be the "equilibrium" macrostate. A real physical system in equilibrium has a huge number of possible microstates, and almost all of them belong to the equilibrium macrostate, which is the macrostate you will almost certainly see if you wait long enough. In the coin example, if you start out with a very unlikely macrostate (like all heads, which has zero entropy) and begin flipping one coin at a time, the entropy of the macrostate will start increasing, just as thermodynamic entropy does, and after a while the coins will most likely be at or near the 50–50 macrostate, which has the greatest information entropy – the equilibrium entropy.
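
The coin picture can be checked directly with a short Python script; the 1000-coin count and the 45–55% window are arbitrary illustrative choices.

    import math

    # A macrostate is the total number of heads k; it contains W(k) = C(n, k) microstates.
    # The two-coin case from the text:
    for k in range(3):
        w = math.comb(2, k)
        print(f"2 coins, {k} heads: W = {w}, information entropy ln W = {math.log(w):.2f}")

    # For many coins, the macrostates near 50-50 contain almost all microstates.
    n = 1000
    W = [math.comb(n, k) for k in range(n + 1)]
    near_half = sum(W[k] for k in range(int(0.45 * n), int(0.55 * n) + 1))
    print(f"{n} coins: fraction of microstates with 45-55% heads = {near_half / 2**n:.4f}")

With 1000 coins the printed fraction is already above 0.99, and it climbs toward 1 as the number of coins grows, which is why the equilibrium macrostate is overwhelmingly the one observed.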

The macrostate of a system is what we know about the system, for example the temperature, pressure, and volume of a gas in a box. For each set of values of temperature, pressure, and volume there are many arrangements of molecules which result in those values. The number of arrangements of molecules which could result in the same values for temperature, pressure and volume is the number of microstates.

The concept of information entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. When it is applied to the problem of a large number of interacting particles, along with some other constraints, like the conservation of energy, and the assumption that all microstates are equally likely, the resultant theory of statistical mechanics is extremely successful in explaining the laws of thermodynamics.

Ice melting provides an example of entropy increasing.

Example of increasing entropy

Ice melting provides an example in which entropy increases in a small system, a thermodynamic system consisting of the surroundings (the warm room) and the entity of glass container, ice and water which has been allowed to reach thermodynamic equilibrium at the melting temperature of ice. In this system, some heat (δQ) from the warmer surroundings at 298 K (25 °C; 77 °F) transfers to the cooler system of ice and water at its constant temperature (T) of 273 K (0 °C; 32 °F), the melting temperature of ice. The entropy of the system, which is δQ/T, increases by δQ/273 K. The heat δQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. ΔH for ice fusion.

The entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of 298 K is larger than 273 K, and therefore the ratio (entropy change) of δQ/298 K for the surroundings is smaller than the ratio (entropy change) of δQ/273 K for the ice and water system. This is always true in spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than was the initial entropy.

As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the δQ/T over the continuous range, "at many increments", in the initially cool to finally warm water can be found by calculus. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
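
The same bookkeeping can be done per mole of ice with rough numbers; the enthalpy of fusion used below is the commonly tabulated value of roughly 6.0 kJ/mol, and the temperatures are the ones quoted above.

    # Net entropy change when one mole of ice melts inside a warm room.
    q = 6008.0      # J/mol absorbed by the ice, roughly the molar enthalpy of fusion
    T_ice = 273.0   # K, ice-and-water system during melting
    T_room = 298.0  # K, surroundings, assumed essentially unchanged

    dS_system = +q / T_ice     # entropy gained by the ice and water
    dS_room   = -q / T_room    # entropy given up by the room
    print(dS_system, dS_room, dS_system + dS_room)   # roughly +22, -20 and +2 J/K

The two individual terms nearly cancel, but the total is positive, which is the quantitative content of the statement that the final net entropy is greater than the initial entropy.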

Origins and uses

Originally, entropy was named to describe the "waste heat", or more accurately, energy loss, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work. Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level. In the late 19th century, the word "disorder" was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by Werner Heisenberg and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and statistical mechanics.

For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the "motional" (i.e. kinetic) energy of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe entropy as energy dispersal.[6] Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.

The mathematics developed in statistical thermodynamics was found to be applicable in other disciplines. In particular, the information sciences developed the concept of information entropy, which lacks the Boltzmann constant inherent in thermodynamic entropy.

Classical calculation of entropy

When the word 'entropy' was first defined and used in 1865, the very existence of atoms was still controversial, though it had long been speculated that temperature was due to the motion of microscopic constituents and that "heat" was the transferring of that motion from one place to another. Entropy change, ΔS, was described in macroscopic terms that could be directly measured, such as volume, temperature, or pressure. However, today the classical equation of entropy, ΔS = qrev/T, can be explained, part by part, in modern terms describing how molecules are responsible for what is happening:

  • ΔS is the change in entropy of a system (some physical substance of interest) after some motional energy ("heat") has been transferred to it by fast-moving molecules. So, ΔS = Sfinal − Sinitial.
  • Then, ΔS = qrev/T, the quotient of the motional energy ("heat") q that is transferred "reversibly" (rev) to the system from the surroundings (or from another system in contact with the first system) divided by T, the absolute temperature at which the transfer occurs.
    • "Reversible" or "reversibly" (rev) simply means that T, the temperature of the system, has to stay (almost) exactly the same while any energy is being transferred to or from it. That is easy in the case of phase changes, where the system absolutely must stay in the solid or liquid form until enough energy is given to it to break bonds between the molecules before it can change to a liquid or a gas. For example, in the melting of ice at 273.15 K, no matter what temperature the surroundings are – from 273.20 K to 500 K or even higher, the temperature of the ice will stay at 273.15 K until the last molecules in the ice are changed to liquid water, i.e., until all the hydrogen bonds between the water molecules in ice are broken and new, less-exactly fixed hydrogen bonds between liquid water molecules are formed. This amount of energy necessary for ice melting per mole has been found to be 6008 joules at 273 K. Therefore, the entropy change per mole is , or 22 J/K.
    • When the temperature is not at the melting or boiling point of a substance no intermolecular bond-breaking is possible, and so any motional molecular energy ("heat") from the surroundings transferred to a system raises its temperature, making its molecules move faster and faster. As the temperature is constantly rising, there is no longer a particular value of "T" at which energy is transferred. However, a "reversible" energy transfer can be measured at a very small temperature increase, and a cumulative total can be found by adding each of many small temperature intervals or increments. For example, to find the entropy change from 300 K to 310 K, measure the amount of energy transferred at dozens or hundreds of temperature increments, say from 300.00 K to 300.01 K and then 300.01 to 300.02 and so on, dividing the q by each T, and finally adding them all.
    • Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the energy being transferred "per incremental change in temperature" (the heat capacity, Cp), multiplied by the integral of dT/T from Tinitial to Tfinal, is directly given by ΔS = Cp ln(Tfinal/Tinitial). (A short numerical sketch of these calculations follows this list.)
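
A minimal Python sketch of the two calculations just described; the constant heat capacity is an assumed value chosen only for illustration.

    import math

    # (1) Melting ice at its melting point: a single q/T ratio.
    print(6008.0 / 273.0)          # J/(mol*K); about 22, as stated above

    # (2) Heating from 300 K to 310 K: add up q/T over many small increments
    #     and compare with the closed form Cp*ln(T_final/T_initial).
    Cp = 100.0                     # J/K, assumed constant heat capacity (illustrative)
    T1, T2, steps = 300.0, 310.0, 1000
    dT = (T2 - T1) / steps
    S_sum = sum(Cp * dT / (T1 + (i + 0.5) * dT) for i in range(steps))
    print(S_sum, Cp * math.log(T2 / T1))   # the two agree to several decimal places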

Alternate explanations of entropy

Thermodynamic entropy
  • A measure of energy unavailable for work: This is an often-repeated phrase which, although it is true, requires considerable clarification to be understood. It is only true for cyclic reversible processes, and is in this sense misleading. By "work" is meant moving an object, for example, lifting a weight, or bringing a flywheel up to speed, or carrying a load up a hill. To convert heat into work, using a coal-burning steam engine, for example, one must have two systems at different temperatures, and the amount of work you can extract depends on how large the temperature difference is, and how large the systems are. If one of the systems is at room temperature, and the other system is much larger, and near absolute zero temperature, then almost ALL of the energy of the room temperature system can be converted to work. If they are both at the same room temperature, then NONE of the energy of the room temperature system can be converted to work. Entropy is then a measure of how much energy cannot be converted to work, given these conditions. More precisely, for an isolated system comprising two closed systems at different temperatures, in the process of reaching equilibrium the amount of entropy lost by the hot system, multiplied by the temperature of the colder system, is the amount of energy that cannot be converted to work. (A numerical sketch follows this list.)
  • An indicator of irreversibility: fitting closely with the 'unavailability of energy' interpretation is the 'irreversibility' interpretation. Spontaneous thermodynamic processes are irreversible, in the sense that they do not spontaneously undo themselves. Thermodynamic processes artificially imposed by agents in the surroundings of a body also have irreversible effects on the body. For example, when James Prescott Joule used a device that delivered a measured amount of mechanical work from the surroundings through a paddle that stirred a body of water, the energy transferred was received by the water as heat. There was scarcely any expansion of the water to do thermodynamic work back on the surroundings. The body of water showed no sign of returning the energy by stirring the paddle in reverse. The work transfer appeared as heat, and was not recoverable without a suitably cold reservoir in the surroundings. Entropy gives a precise account of such irreversibility.
  • Dispersal: Edward A. Guggenheim proposed an ordinary language interpretation of entropy that may be rendered as "dispersal of modes of microscopic motion throughout their accessible range".[7][8] Later, along with a criticism of the idea of entropy as 'disorder', the dispersal interpretation was advocated by Frank L. Lambert,[6][9] and is used in some student textbooks.[10]
The interpretation properly refers to dispersal in abstract microstate spaces, but it may be loosely visualised in some simple examples of spatial spread of matter or energy. If a partition is removed from between two different gases, the molecules of each gas spontaneously disperse as widely as possible into their respective newly accessible volumes; this may be thought of as mixing. If a partition that blocks heat transfer between two bodies of different temperatures is removed so that heat can pass between the bodies, then energy spontaneously disperses or spreads as heat from the hotter body to the colder.
Beyond such loose visualizations, in a general thermodynamic process, considered microscopically, spontaneous dispersal occurs in abstract microscopic phase space. According to Newton's and other laws of motion, phase space provides a systematic scheme for the description of the diversity of microscopic motion that occurs in bodies of matter and radiation. The second law of thermodynamics may be regarded as quantitatively accounting for the intimate interactions, dispersal, or mingling of such microscopic motions. In other words, entropy may be regarded as measuring the extent of diversity of motions of microscopic constituents of bodies of matter and radiation in their own states of internal thermodynamic equilibrium.
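
As a numerical illustration of the first bullet above (energy unavailable for work), the following sketch uses two idealized heat reservoirs with arbitrary temperatures and the textbook Carnot limit for the maximum work obtainable from heat passing between them.

    # Heat Q is drawn from a hot reservoir; a cold reservoir serves as the sink.
    Q = 1000.0       # J (illustrative)
    T_hot = 600.0    # K
    T_cold = 300.0   # K

    W_max = Q * (1 - T_cold / T_hot)    # most work a reversible engine can deliver
    unavailable = Q - W_max             # heat that must be rejected to the cold reservoir
    dS_hot = Q / T_hot                  # entropy given up by the hot reservoir

    # All three printed values are 500 J (up to floating-point rounding): the energy
    # unavailable for work equals the entropy lost by the hot reservoir times the
    # temperature of the colder reservoir.
    print(W_max, unavailable, T_cold * dS_hot)

With T_cold near absolute zero almost all of Q becomes available as work, and with T_cold equal to T_hot none of it does, matching the two limiting cases described in the bullet.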

Information entropy and statistical mechanics
  • As a measure of disorder: Traditionally, 20th-century textbooks have introduced entropy in terms of order and disorder, so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in, and arbitrary interpretations of, the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the Shannon entropy of the probability distribution of microstates given a particular macrostate,[11]: 379  in which case the connection of "disorder" to thermodynamic entropy is straightforward, but arbitrary and not immediately obvious to anyone unfamiliar with information theory.
  • Missing information: The idea that information entropy is a measure of how much one does not know about a system is quite useful.
If, instead of the natural logarithm, we use the base-2 logarithm to define information entropy, then the information entropy is roughly equal to the average number of (carefully chosen [12]) yes/no questions that would have to be asked to get complete information about the system under study. In the introductory example of two flipped coins, for the macrostate which contains one head and one tail, one would need only one question to determine its exact state (e.g. "Is the first one heads?"), and instead of expressing the entropy as ln(2) one could say, equivalently, that it is log2(2), which equals the number of binary questions we would need to ask: one. When measuring entropy using the natural logarithm (ln), the unit of information entropy is called a "nat", but when it is measured using the base-2 logarithm, the unit of information entropy is called a "shannon" (alternatively, "bit"). This is just a difference in units, much like the difference between inches and centimeters (1 nat = log2(e) shannons). Thermodynamic entropy is equal to the Boltzmann constant times the information entropy expressed in nats. The information entropy expressed with the unit shannon (Sh) is equal to the number of yes–no questions that need to be answered in order to determine the microstate from the macrostate.
The concepts of "disorder" and "spreading" can be analyzed with this information entropy concept in mind. For example, if we take a new deck of cards out of the box, it is arranged in "perfect order" (spades, hearts, diamonds, clubs, each suit beginning with the ace and ending with the king), we may say that we then have an "ordered" deck with an information entropy of zero. If we thoroughly shuffle the deck, the information entropy will be about 225.6 shannons: We will need to ask about 225.6 questions, on average, to determine the exact order of the shuffled deck. We can also say that the shuffled deck has become completely "disordered" or that the ordered cards have been "spread" throughout the deck. But information entropy does not say that the deck needs to be ordered in any particular way. If we take our shuffled deck and write down the names of the cards, in order, then the information entropy becomes zero. If we again shuffle the deck, the information entropy would again be about 225.6 shannons, even if by some miracle it reshuffled to the same order as when it came out of the box, because even if it did, we would not know that. So the concept of "disorder" is useful if, by order, we mean maximal knowledge and by disorder we mean maximal lack of knowledge. The "spreading" concept is useful because it gives a feeling to what happens to the cards when they are shuffled. The probability of a card being in a particular place in an ordered deck is either 0 or 1, in a shuffled deck it is 1/52. The probability has "spread out" over the entire deck. Analogously, in a physical system, entropy is generally associated with a "spreading out" of mass or energy.
The connection between thermodynamic entropy and information entropy is given by Boltzmann's equation, which says that S = kB ln W. If we take the base-2 logarithm of W, it will yield the average number of questions we must ask about the microstate of the physical system in order to determine its macrostate.[13]
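
The card-deck figures and the unit conversions above are easy to reproduce; the Boltzmann-constant line at the end is only meant to show the scale of kB, not a physically meaningful entropy.

    import math

    # A well-shuffled 52-card deck: 52! equally likely orderings.
    H_nats = math.log(math.factorial(52))       # information entropy in nats
    H_shannons = H_nats * math.log2(math.e)     # 1 nat = log2(e) shannons
    print(H_shannons)                           # about 225.6, as quoted above

    # Thermodynamic entropy = kB times the information entropy in nats.
    k_B = 1.380649e-23                          # J/K, Boltzmann constant
    print(k_B * H_nats)                         # about 2.2e-21 J/K for this toy "system"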

References
  1. ^ "Definition of entropy in English". Lexico Powered By Oxford. Archived from the original on July 11, 2019. Retrieved 18 November 2020.
  2. ^ Theoretically, coffee can be "unmixed" and wood can be "unburned", but this would need a "machine" that would generate more entropy than was lost in the original process. This is why the second law only holds for isolated systems, which means they cannot be connected to some external "machine".
  3. ^ I. Klotz, R. Rosenberg, Chemical Thermodynamics – Basic Concepts and Methods, 7th ed., Wiley (2008), p. 125
  4. ^ Atkins, Peter; de Paula, Julio (2006). Atkins' Physical Chemistry (8th ed.). W. H. Freeman. pp. 92–94. ISBN 0-7167-8759-8.
  5. ^ Laidler, Keith J.; Meiser, John H. (1982). Physical Chemistry. Benjamin/Cummings. p. 110. ISBN 0-8053-5682-7. Entropies can then be determined at other temperatures, by considering a series of reversible processes by which the temperature is raised from the absolute zero to the temperature in question.
  6. ^ a b Entropy Sites — A Guide. Content selected by Frank L. Lambert.
  7. ^ Dugdale, J.S. (1996). Entropy and its Physical Meaning, Taylor & Francis, London, ISBN 0748405682, Dugdale cites only Guggenheim, on page 101.
  8. ^ Guggenheim, E.A. (1949), Statistical basis of thermodynamics, Research: A Journal of Science and its Applications, 2, Butterworths, London, pp. 450–454; p. 453, "If instead of entropy one reads number of accessible states, or spread, the physical significance becomes clear."
  9. ^ Kozliak, Evguenii I.; Lambert, Frank L. (2005). ""Order-to-Disorder" for Entropy Change? Consider the Numbers!" (PDF). Chem. Educator. 10: 24–25.
  10. ^ For example: Atkins, P. W., de Paula J. Atkins' Physical Chemistry, 2006, W.H. Freeman and Company, 8th edition, ISBN 9780716787594. Brown, T. L., H. E. LeMay, B. E. Bursten, C.J. Murphy, P. Woodward, M.E. Stoltzfus 2017. Chemistry: The Central Science, 10th ed. Prentice Hall, 1248pp, ISBN 9780134414232. Ebbing, D.D., and S. D. Gammon, 2017. General Chemistry, 11th ed. Cengage Learning 1190pp, ISBN 9781305580343. Petrucci, Herring, Madura, Bissonnette 2011 General Chemistry: Principles and Modern Applications, 10th edition, 1426 pages, Pearson Canada ISBN 9780132064521.
  11. ^ Callen, Herbert B. (1985). Thermodynamics and an Introduction to Thermostatistics (2nd ed.). New York: John Wiley & Sons. ISBN 0-471-86256-8.
  12. ^ The minimum number of questions is achieved when each question either gives an answer with certainty, or cuts the remaining uncertainty in half. For example, if we had a probability function associated with a variable x that can take the values x1, x2, x3, x4 with probabilities P1 = 1/8, P2 = 1/2, P3 = 1/8, P4 = 1/4, then the optimum mode of questioning would be to first ask "Is x equal to x2?" If the answer is "yes", then x is certainly equal to x2 after asking only one question, and the probability of this happening is P2 = 1/2. If the answer is "no", then the next question would be "Is x equal to x4?" If the answer is yes, then x is certainly equal to x4 after asking two questions, and the probability of this happening is P4 = 1/4. If the answer is "no", we may finally ask "Is x equal to x1?" If the answer is yes, then x is certainly equal to x1, and if not, then x is certainly equal to x3; the probability of requiring three questions is P1 + P3 = 1/4. The average number of binary questions asked is then Q = (1/2)(1) + (1/4)(2) + (1/4)(3) = 7/4. Calculating the Shannon information entropy:
    H = −(1/2) log2(1/2) − (1/4) log2(1/4) − (1/8) log2(1/8) − (1/8) log2(1/8) = 1/2 + 1/2 + 3/8 + 3/8 = 7/4 Sh,
    which is in agreement with the step-by-step procedure. In most cases it is not clear how to continually divide the remaining options in half with each question, so the concept is strictly applicable only in special cases, and becomes more accurate as the number of possible outcomes increases. Nevertheless, the Shannon expression for Q is valid even in these cases.
  13. ^ In classical mechanics the velocities and positions are real numbers, and there is a continuum of an infinite number of microstates. This would mean that an infinite number of questions would have to be asked in order to determine a macrostate. In quantum mechanics, the microstates are "quantized" and there are a finite number of them for a given energy, so the number of questions is finite. Boltzmann developed his theory before the advent of quantum mechanics, and it is to his credit that he was nevertheless able to develop a theory that dealt with a theoretically infinite number of microstates.

Further reading

  • Goldstein, Martin; Goldstein, Inge F. (1993). The Refrigerator and the Universe: Understanding the Laws of Energy. Harvard Univ. Press. ISBN 9780674753259. Chapters 4–12 touch on entropy.