
Entropy production

From Wikipedia, the free encyclopedia

Entropy production (or generation) is the amount of entropy produced during irreversible processes,[1] such as heat and mass transfer processes (including the motion of bodies, heat exchange, fluid flow, expansion or mixing of substances, and anelastic deformation of solids) and any irreversible thermodynamic cycle, including thermal machines such as power plants, heat engines, refrigerators, heat pumps, and air conditioners.

In the dual entropy–exergy representation of the second law of thermodynamics, it can be expressed in equivalent terms of exergy destruction.

Rudolf Clausius

Short history

Entropy is produced in irreversible processes. The importance of avoiding irreversible processes (and hence reducing the entropy production) was recognized as early as 1824 by Carnot.[2] In 1865 Rudolf Clausius expanded his previous work from 1854[3] on the concept of "unkompensierte Verwandlungen" (uncompensated transformations), which, in our modern nomenclature, would be called entropy production. In the same article in which he introduced the name entropy,[4] Clausius gives the expression for the entropy production of a closed system, which he denotes by N, in equation (71), which reads

N = S - S_0 - \int \frac{\mathrm{d}Q}{T}

Here S is the entropy in the final state, S_0 the entropy in the initial state, and the integral is to be taken from the initial state to the final state. From the context it is clear that N = 0 if the process is reversible and N > 0 in the case of an irreversible process.

First and second law

Fig.1 General representation of an inhomogeneous system that consists of a number of subsystems. The interaction of the system with the surroundings is through exchange of heat and other forms of energy, flow of matter, and changes of shape. The internal interactions between the various subsystems are of a similar nature and lead to entropy production.

The laws of thermodynamics apply to well-defined systems. Fig.1 is a general representation of a thermodynamic system. We consider systems which, in general, are inhomogeneous. Heat and mass are transferred across the boundaries (nonadiabatic, open systems), and the boundaries are moving (usually through pistons). In our formulation we assume that heat and mass transfer and volume changes take place only separately, at well-defined regions of the system boundary. The expressions given here are not the most general formulations of the first and second law. For example, kinetic-energy and potential-energy terms are missing, and exchange of matter by diffusion is excluded.

The rate of entropy production, denoted by \dot S_i, is a key element of the second law of thermodynamics for open inhomogeneous systems, which reads

\frac{\mathrm{d}S}{\mathrm{d}t} = \sum_k \frac{\dot Q_k}{T_k} + \sum_k \dot S_k + \sum_k \dot S_{\mathrm{i}k} \qquad \text{with} \qquad \dot S_{\mathrm{i}k} \ge 0

Here S is the entropy of the system; T_k is the temperature at which the heat flow \dot Q_k enters the system; \dot S_k = \dot n_k S_{mk} = \dot m_k s_k represents the entropy flow into the system at position k, due to matter flowing into the system (\dot n_k and \dot m_k are the molar flow and mass flow, and S_{mk} and s_k are the molar entropy (i.e. entropy per mole) and specific entropy (i.e. entropy per unit mass) of the matter flowing into the system, respectively); \dot S_{ik} represents the entropy production rates due to internal processes. The index i in \dot S_{ik} refers to the fact that the entropy is produced due to irreversible processes. The entropy-production rate of every process in nature is always positive or zero. This is an essential aspect of the second law.

The ∑'s indicate the algebraic sum of the respective contributions if there are more heat flows, matter flows, and internal processes.
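As an illustration (a Python sketch added here, not part of the original article; all numbers are hypothetical), in a steady state dS/dt = 0 and the entropy balance determines the internal production rate once the boundary flows are known:

# Steady-state entropy balance:  0 = sum_k Qdot_k/T_k + sum_k Sdot_k + Sdot_i,
# so the internal production rate is whatever closes the balance.
# All values below are hypothetical illustration numbers.
Q_dot = [1000.0, -800.0]    # heat flows into the system [W]
T     = [600.0, 300.0]      # boundary temperatures at which they enter [K]
S_dot_matter = [0.5, -0.6]  # entropy carried by matter flows [W/K]

S_dot_i = 0.0 - sum(q / t for q, t in zip(Q_dot, T)) - sum(S_dot_matter)
print(f"entropy production rate = {S_dot_i:.3f} W/K")  # must be >= 0 for a real process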

In order to demonstrate the impact of the second law, and the role of entropy production, it has to be combined with the first law, which reads

\frac{\mathrm{d}U}{\mathrm{d}t} = \sum_k \dot Q_k + \sum_k \dot H_k - \sum_k p_k \frac{\mathrm{d}V_k}{\mathrm{d}t} + P

with U the internal energy of the system; \dot H_k = \dot n_k H_{mk} = \dot m_k h_k the enthalpy flows into the system due to the matter that flows into the system (H_{mk} its molar enthalpy, h_k the specific enthalpy (i.e. enthalpy per unit mass)); dV_k/dt the rates of change of the volume of the system due to a moving boundary at position k, while p_k is the pressure behind that boundary; P represents all other forms of power supplied to the system (such as electrical).

The first and second law have been formulated in terms of time derivatives of U and S rather than in terms of the total differentials dU and dS, for which it is tacitly assumed that dt > 0; the formulation in terms of time derivatives is therefore more elegant. An even bigger advantage of this formulation is, however, that it emphasizes that heat flow and power are the basic thermodynamic properties, and that heat and work are derived quantities, being the time integrals of the heat flow and the power respectively.

Examples of irreversible processes

Entropy is produced in irreversible processes. Some important irreversible processes are:

  • heat flow through a thermal resistance
  • fluid flow through a flow resistance such as in the Joule expansion or the Joule-Thomson effect
  • diffusion
  • chemical reactions
  • Joule heating
  • friction between solid surfaces
  • fluid viscosity within a system.

The expression for the rate of entropy production in the first two cases will be derived in separate sections.

Fig.2a: Schematic diagram of a heat engine. A heating power \dot Q_H enters the engine at the high temperature T_H, and \dot Q_a is released at ambient temperature T_a. A power P is produced and the entropy production rate is \dot S_i. Fig.2b: Schematic diagram of a refrigerator. \dot Q_L is the cooling power at the low temperature T_L, and \dot Q_a is released at ambient temperature. The power P is supplied and \dot S_i is the entropy production rate. The arrows define the positive directions of the flows of heat and power in the two cases. They are positive under normal operating conditions.

Performance of heat engines and refrigerators

Most heat engines and refrigerators are closed cyclic machines.[5] In the steady state the internal energy and the entropy of the machines after one cycle are the same as at the start of the cycle. Hence, on average, dU/dt = 0 and dS/dt = 0 since U and S are functions of state. Furthermore they are closed systems (\dot n_k = 0) and the volume is fixed (dV/dt = 0). This leads to a significant simplification of the first and second law:

0 = \sum_k \dot Q_k + P

and

0 = \sum_k \frac{\dot Q_k}{T_k} + \dot S_i
The summation is over the (two) places where heat is added or removed.

Engines

For a heat engine (Fig.2a) the first and second law obtain the form

0 = \dot Q_H - \dot Q_a - P

and

0 = \frac{\dot Q_H}{T_H} - \frac{\dot Q_a}{T_a} + \dot S_i
Here \dot Q_H is the heat supplied at the high temperature T_H, \dot Q_a is the heat removed at ambient temperature T_a, and P is the power delivered by the engine. Eliminating \dot Q_a gives

P = \frac{T_H - T_a}{T_H}\,\dot Q_H - T_a \dot S_i

The efficiency is defined by

\eta = \frac{P}{\dot Q_H}

If \dot S_i = 0, the performance of the engine is at its maximum and the efficiency is equal to the Carnot efficiency

\eta_C = 1 - \frac{T_a}{T_H}
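A short Python sketch (added here, with hypothetical numbers) shows how a finite entropy production rate reduces the delivered power and the efficiency below the Carnot value:

Q_H = 1000.0   # heating power supplied at T_H [W]
T_H = 600.0    # hot temperature [K]
T_a = 300.0    # ambient temperature [K]
S_i = 0.5      # entropy production rate [W/K]

P          = (T_H - T_a) / T_H * Q_H - T_a * S_i   # delivered power
eta        = P / Q_H                               # actual efficiency
eta_carnot = 1.0 - T_a / T_H                       # Carnot efficiency (S_i = 0)
print(f"P = {P:.0f} W, eta = {eta:.3f}, Carnot eta = {eta_carnot:.3f}")
# P = 350 W, eta = 0.350, Carnot eta = 0.500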

Refrigerators

For refrigerators (Fig.2b) the first and second law take the form

0 = \dot Q_L - \dot Q_a + P

and

0 = \frac{\dot Q_L}{T_L} - \frac{\dot Q_a}{T_a} + \dot S_i
Here P is the power supplied to produce the cooling power \dot Q_L at the low temperature T_L. Eliminating \dot Q_a now gives

P = \frac{T_a - T_L}{T_L}\,\dot Q_L + T_a \dot S_i

The coefficient of performance (COP) of refrigerators is defined by

\xi = \frac{\dot Q_L}{P}

If \dot S_i = 0, the performance of the cooler is at its maximum. The COP is then given by the Carnot coefficient of performance

\xi_C = \frac{T_L}{T_a - T_L}
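The corresponding Python sketch for the refrigerator (again with assumed illustration values):

Q_L = 100.0    # cooling power at T_L [W]
T_L = 270.0    # cold temperature [K]
T_a = 300.0    # ambient temperature [K]
S_i = 0.05     # entropy production rate [W/K]

P          = (T_a - T_L) / T_L * Q_L + T_a * S_i   # required input power
COP        = Q_L / P                               # actual coefficient of performance
COP_carnot = T_L / (T_a - T_L)                     # Carnot COP (S_i = 0)
print(f"P = {P:.1f} W, COP = {COP:.2f}, Carnot COP = {COP_carnot:.1f}")
# P = 26.1 W, COP = 3.83, Carnot COP = 9.0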

Power dissipation

In both cases we find a contribution T_a \dot S_i which reduces the system performance. This product of ambient temperature and the (average) entropy production rate is called the dissipated power.

Equivalence with other formulations

It is interesting to investigate how the above mathematical formulation of the second law relates to other well-known formulations of the second law.

We first look at a heat engine, assuming that \dot Q_a = 0. In other words: the heat flow \dot Q_H is completely converted into power. In this case the second law would reduce to

0 = \frac{\dot Q_H}{T_H} + \dot S_i

Since \dot Q_H > 0 and T_H > 0, this would result in \dot S_i < 0, which violates the condition that the entropy production is always positive. Hence: No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work. This is the Kelvin statement of the second law.

Now look at the case of the refrigerator and assume that the input power is zero. In other words: heat is transported from a low temperature to a high temperature without doing work on the system. The first law with P = 0 would give

\dot Q_L = \dot Q_a

and the second law then yields

0 = \frac{\dot Q_L}{T_L} - \frac{\dot Q_L}{T_a} + \dot S_i

or

\dot S_i = \dot Q_L \left( \frac{1}{T_a} - \frac{1}{T_L} \right)

Since \dot Q_L > 0 and T_a > T_L, this would result in \dot S_i < 0, which again violates the condition that the entropy production is always positive. Hence: No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature. This is the Clausius statement of the second law.

Expressions for the entropy production

Heat flow

In case of a heat flow \dot Q from T_1 to T_2 (with T_1 > T_2) the rate of entropy production is given by

\dot S_i = \dot Q \left( \frac{1}{T_2} - \frac{1}{T_1} \right)

If the heat flow is in a bar with length L, cross-sectional area A, and thermal conductivity κ, and the temperature difference is small,

\dot Q = \frac{\kappa A}{L}(T_1 - T_2)

and the entropy production rate is

\dot S_i = \frac{\kappa A}{L}\,\frac{(T_1 - T_2)^2}{T_1 T_2}
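A Python sketch of this expression (added here; the bar dimensions and conductivity are assumed values, roughly those of a small copper bar):

kappa  = 400.0           # thermal conductivity of the bar [W/(m K)], assumed
A      = 1e-4            # cross-sectional area [m^2]
L      = 0.1             # length of the bar [m]
T1, T2 = 310.0, 300.0    # end temperatures [K]

Q_dot   = kappa * A / L * (T1 - T2)                  # heat flow through the bar [W]
S_dot_i = kappa * A / L * (T1 - T2)**2 / (T1 * T2)   # entropy production rate [W/K]
print(f"Q = {Q_dot:.1f} W, S_i = {S_dot_i * 1000:.3f} mW/K")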

Flow of mass

In case of a volume flow \dot V from a pressure p_1 to p_2,

\dot S_i = -\int_{p_1}^{p_2} \frac{\dot V}{T}\,\mathrm{d}p

For small pressure drops, and defining the flow conductance C by \dot V = C (p_1 - p_2), we get

\dot S_i = \frac{C (p_1 - p_2)^2}{T}

The dependences of \dot S_i on (T_1 - T_2) and on (p_1 - p_2) are quadratic.

This is typical of expressions for entropy production rates in general: they guarantee that the entropy production is positive.
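The quadratic dependence in the mass-flow case can be checked with a short Python sketch (added here; the conductance, temperature, and pressure drops are assumed values):

C = 1e-8       # flow conductance [m^3/(Pa s)], assumed
T = 300.0      # temperature [K]
for dp in (1e3, 2e3, 4e3):        # pressure drops p1 - p2 [Pa]
    S_dot_i = C * dp**2 / T       # entropy production rate [W/K]
    print(f"dp = {dp:6.0f} Pa  ->  S_i = {S_dot_i:.2e} W/K")
# doubling the pressure drop quadruples the entropy production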

Entropy of mixing

In this section we will calculate the entropy of mixing when two ideal gases diffuse into each other. Consider a volume V_t divided into two volumes V_a and V_b so that V_t = V_a + V_b. The volume V_a contains n_a moles of an ideal gas a and V_b contains n_b moles of a gas b. The total amount is n_t = n_a + n_b. The temperature and pressure in the two volumes are the same. The entropy at the start is given by

S_{t1} = S_{a1} + S_{b1}

When the partition between the two gases is removed, the two gases expand, comparable to a Joule expansion. In the final state the temperature is the same as initially, but the two gases now both take the volume V_t. The entropy of n moles of an ideal gas, as a function of T and V, is

S = n C_V \ln T + n R \ln V + \text{constant}

with C_V the molar heat capacity at constant volume and R the molar ideal gas constant. The system is an adiabatic closed system, so the entropy increase during the mixing of the two gases is equal to the entropy production. It is given by

S_i = S_{t2} - S_{t1}

As the initial and final temperatures are the same, the temperature terms play no role, so we can focus on the volume terms. The result is

S_i = n_a R \ln\frac{V_t}{V_a} + n_b R \ln\frac{V_t}{V_b}

Introducing the concentration x = n_a/n_t = V_a/V_t, we arrive at the well-known expression

S_i = -n_t R \left[ x \ln x + (1 - x)\ln(1 - x) \right]
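A Python sketch of this result (added here; one mole of gas in total and equal amounts of a and b are assumed):

import math

R   = 8.314      # molar gas constant [J/(mol K)]
n_t = 1.0        # total amount of gas [mol]
x   = 0.5        # concentration n_a / n_t

S_i = -n_t * R * (x * math.log(x) + (1 - x) * math.log(1 - x))
print(f"entropy of mixing = {S_i:.3f} J/K")   # about 5.76 J/K for x = 0.5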

Joule expansion

The Joule expansion is similar to the mixing described above. It takes place in an adiabatic system consisting of a gas and two rigid vessels (a and b) of equal volume, connected by a valve. Initially, the valve is closed. Vessel (a) contains the gas under high pressure while the other vessel (b) is empty. When the valve is opened the gas flows from vessel (a) into (b) until the pressures in the two vessels are equal. The volume taken by the gas is doubled while the internal energy of the system is constant (the system is adiabatic and no work is done). Assuming that the gas is ideal, the molar internal energy is given by U_m = C_V T. As C_V is constant, constant U means constant T. The molar entropy of an ideal gas, as a function of the molar volume V_m and T, is given by

S_m = C_V \ln T + R \ln V_m + \text{constant}

The system of the two vessels and the gas is closed and adiabatic, so the entropy production during the process is equal to the increase of the entropy of the gas. Doubling the volume with T constant then gives an entropy production per mole of gas of

S_{mi} = R \ln 2

Microscopic interpretation

The Joule expansion gives a nice opportunity to explain the entropy production in statistical-mechanical (microscopic) terms. In the expansion, the volume that the gas can occupy is doubled. That means that, for every molecule, there are now two possibilities: it can be placed in container a or in b. If we have one mole of gas, the number of molecules is equal to Avogadro's number N_A. The increase of the microscopic possibilities is a factor 2 per molecule, so in total a factor 2^{N_A}. Using the well-known Boltzmann expression for the entropy

S = k \ln \Omega

with k Boltzmann's constant and Ω the number of microscopic possibilities to realize the macroscopic state, gives

S_{mi} = k \ln 2^{N_A} = k N_A \ln 2 = R \ln 2

So, in an irreversible process, the number of microscopic possibilities to realize the macroscopic state is increased by a certain factor.
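As a consistency check (a Python sketch added here; the constants are the standard values), the statistical count reproduces the thermodynamic result R ln 2 per mole:

import math

k   = 1.380649e-23     # Boltzmann constant [J/K]
N_A = 6.02214076e23    # Avogadro's number [1/mol]
R   = k * N_A          # molar gas constant [J/(mol K)]

S_macro = R * math.log(2)         # entropy production per mole, thermodynamic route
S_micro = k * N_A * math.log(2)   # k ln(2^N_A), rewritten as k N_A ln 2 (2**N_A itself would overflow)
print(f"macroscopic: {S_macro:.4f} J/(mol K), microscopic: {S_micro:.4f} J/(mol K)")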

Basic inequalities and stability conditions

In this section we derive the basic inequalities and stability conditions for closed systems. For closed systems the first law reduces to

\frac{\mathrm{d}U}{\mathrm{d}t} = \dot Q - p \frac{\mathrm{d}V}{\mathrm{d}t} + P

The second law we write as

\frac{\mathrm{d}S}{\mathrm{d}t} \ge \frac{\dot Q}{T}

For adiabatic systems \dot Q = 0, so dS/dt ≥ 0. In other words: the entropy of adiabatic systems cannot decrease. In equilibrium the entropy is at its maximum. Isolated systems are a special case of adiabatic systems, so this statement is also valid for isolated systems.

Now consider systems with constant temperature and volume. In most cases T is the temperature of the surroundings, with which the system is in good thermal contact. Since V is constant, the first law gives \dot Q = \mathrm{d}U/\mathrm{d}t - P. Substitution in the second law, and using that T is constant, gives

\frac{\mathrm{d}(U - TS)}{\mathrm{d}t} \le P

With the Helmholtz free energy, defined as

F = U - TS

we get

\frac{\mathrm{d}F}{\mathrm{d}t} \le P

If P = 0 this is the mathematical formulation of the general property that the free energy of systems with fixed temperature and volume tends to a minimum. The expression can be integrated from the initial state i to the final state f, resulting in

W_S \le F_i - F_f

where W_S is the work done by the system. If the process inside the system is completely reversible, the equality sign holds. Hence the maximum work that can be extracted from the system is equal to the free energy of the initial state minus the free energy of the final state.
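As an illustration (an added example, not from the original text): for the isothermal expansion of n moles of an ideal gas at temperature T from volume V_i to V_f, the internal energy does not change, so the extractable work is bounded by

W_S \le F_i - F_f = T\left[ S(V_f) - S(V_i) \right] = n R T \ln\frac{V_f}{V_i}

with the equality reached only if the expansion is carried out reversibly.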

Finally we consider systems with constant temperature and pressure and take P = 0. As p is constant, the first law gives

\frac{\mathrm{d}(U + pV)}{\mathrm{d}t} = \dot Q

Combining with the second law, and using that T is constant, gives

\frac{\mathrm{d}(U + pV - TS)}{\mathrm{d}t} \le 0

With the Gibbs free energy, defined as

G = U + pV - TS

we get

\frac{\mathrm{d}G}{\mathrm{d}t} \le 0

Homogeneous systems

In homogeneous systems the temperature and pressure are well-defined and all internal processes are reversible. Hence \dot S_i = 0. As a result the second law, multiplied by T, reduces to

T \frac{\mathrm{d}S}{\mathrm{d}t} = \dot Q + \dot n\, T S_m

With P = 0 the first law becomes

\frac{\mathrm{d}U}{\mathrm{d}t} = \dot Q + \dot n H_m - p \frac{\mathrm{d}V}{\mathrm{d}t}

Eliminating \dot Q and multiplying with dt gives

\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V + \left( H_m - T S_m \right)\mathrm{d}n

Since

H_m - T S_m = G_m = \mu

with G_m the molar Gibbs free energy and μ the molar chemical potential, we obtain the well-known result

\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V + \mu\,\mathrm{d}n

Entropy production in stochastic processes

Since physical processes can be described by stochastic processes, such as Markov chains and diffusion processes, entropy production can be defined mathematically in such processes.[6]

For a continuous-time Markov chain with instantaneous probability distribution p_i(t) and transition rates q_{ij}, the instantaneous entropy production rate is

e_p(t) = \frac{1}{2} \sum_{i,j} \left[ p_i(t) q_{ij} - p_j(t) q_{ji} \right] \ln \frac{p_i(t) q_{ij}}{p_j(t) q_{ji}}

The long-time behavior of the entropy production is preserved under a proper lifting of the process. This approach provides a dynamic explanation for the Kelvin statement and the Clausius statement of the second law of thermodynamics.[7]
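As an illustration (a Python sketch added here, using a hypothetical three-state chain whose rates and distribution are made up), the formula above can be evaluated directly:

import math

# transition rates q[i][j] (i != j) of a hypothetical 3-state continuous-time Markov chain
q = [[0.0, 2.0, 1.0],
     [1.0, 0.0, 3.0],
     [2.0, 1.0, 0.0]]
p = [0.5, 0.3, 0.2]   # instantaneous probability distribution (assumed)

e_p = 0.0
for i in range(3):
    for j in range(3):
        if i != j:
            flux_ij = p[i] * q[i][j]   # probability flux i -> j
            flux_ji = p[j] * q[j][i]   # probability flux j -> i
            e_p += 0.5 * (flux_ij - flux_ji) * math.log(flux_ij / flux_ji)
print(f"instantaneous entropy production rate: {e_p:.4f}")  # non-negative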

References

  1. ^ S.R. de Groot and P. Mazur, Non-equilibrium thermodynamics (North-Holland Publishing Company, Amsterdam-London, 1969)
  2. ^ S. Carnot Reflexions sur la puissance motrice du feu Bachelier, Paris, 1824
  3. ^ Clausius, R. (1854). "Ueber eine veränderte Form des zweiten Hauptsatzes der mechanischen Wärmetheorie". Annalen der Physik und Chemie. 93 (12): 481–506. doi:10.1002/andp.18541691202. Retrieved 25 June 2012. Clausius, R. (August 1856). "On a Modified Form of the Second Fundamental Theorem in the Mechanical Theory of Heat". Phil. Mag. 4. 12 (77): 81–98. doi:10.1080/14786445608642141. Retrieved 25 June 2012.
  4. ^ R. Clausius, Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie, Ann. Phys. [2] 125, 390 (1865). This paper is translated and can be found in: The Second Law of Thermodynamics, edited by J. Kestin, Dowden, Hutchinson & Ross, Inc., Stroudsburg, Pennsylvania, pp. 162–193.
  5. ^ A.T.A.M. de Waele, Basic operation of cryocoolers and related thermal machines, Review article, Journal of Low Temperature Physics, Vol.164, pp. 179-236, (2011), DOI: 10.1007/s10909-011-0373-x.
  6. ^ Jiang, Da-Quan; Qian, Min; Qian, Min-Ping (2004). Mathematical theory of nonequilibrium steady states: on the frontier of probability and dynamical systems. Berlin: Springer. ISBN 978-3-540-40957-1.
  7. ^ Wang, Yue; Qian, Hong (2020). "Mathematical Representation of Clausius' and Kelvin's Statements of the Second Law and Irreversibility". Journal of Statistical Physics. 179 (3): 808–837. arXiv:1805.09530. doi:10.1007/s10955-020-02556-6.

Further reading