Detailed balance: Difference between revisions

From Wikipedia, the free encyclopedia
GrouchoBot (talk | contribs), no edit summary
Revision as of 17:15, 5 May 2009

In mathematics and statistical mechanics, a Markov process is said to show detailed balance if the transition rates between each pair of states i and j in the state space obey

:<math> \pi_i P_{ij} = \pi_j P_{ji} </math>

where P is the Markov transition matrix (transition probability), i.e. P<sub>ij</sub> = P( X<sub>t</sub> = j | X<sub>t−1</sub> = i ); and π<sub>i</sub> and π<sub>j</sub> are the equilibrium probabilities of being in states i and j, respectively.
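As an illustrative sketch (the two-state matrix below is a made-up example, not from the article), the detailed balance condition can be checked numerically by comparing the probability flows π<sub>i</sub>P<sub>ij</sub> and π<sub>j</sub>P<sub>ji</sub>:

```python
import numpy as np

# A hypothetical two-state transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Its stationary distribution: the left eigenvector of P with
# eigenvalue 1, normalised to sum to 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Detailed balance: pi_i * P_ij == pi_j * P_ji for every pair (i, j),
# i.e. the matrix of probability flows is symmetric.
flows = pi[:, None] * P             # flows[i, j] = pi_i * P_ij
print(np.allclose(flows, flows.T))  # True: the chain is reversible
```

Every stationary two-state chain satisfies detailed balance (stationarity alone forces π<sub>1</sub>P<sub>12</sub> = π<sub>2</sub>P<sub>21</sub>); with three or more states the symmetry of `flows` becomes a genuine extra condition.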

The definition carries over straightforwardly to continuous variables, where π becomes a probability density, and P a transition kernel:

:<math> \pi(x) P(x, x') = \pi(x') P(x', x) </math>

A Markov process that satisfies the detailed balance equations is said to be a reversible Markov process or reversible Markov chain with respect to π.

Note that the detailed balance condition is stronger than that required merely for a stationary distribution: it applies separately to each pair of states. Detailed balance also implies that around any closed cycle of states there is no net flow of probability; for example, for all states a, b and c,

:<math> P(a,b)\, P(b,c)\, P(c,a) = P(a,c)\, P(c,b)\, P(b,a). </math>
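As a hedged numerical sketch (the three-state chain below is constructed for illustration, not taken from the article), the no-net-cycle-flow property can be verified by building a chain that satisfies detailed balance by construction and then checking the cycle products:

```python
import numpy as np
from itertools import permutations

# A hypothetical reversible three-state chain, built from a symmetric
# probability-flow matrix F (F[i, j] = pi_i * P_ij), so detailed
# balance holds by construction.
F = np.array([[0.30, 0.12, 0.08],
              [0.12, 0.14, 0.04],
              [0.08, 0.04, 0.08]])
pi = F.sum(axis=1)          # stationary distribution: (0.5, 0.3, 0.2)
P = F / pi[:, None]         # transition matrix; each row sums to 1

# No net flow around any closed cycle a -> b -> c -> a: the forward
# and backward transition-probability products agree.
for a, b, c in permutations(range(3), 3):
    assert np.isclose(P[a, b] * P[b, c] * P[c, a],
                      P[a, c] * P[c, b] * P[b, a])
print("no net probability flow around any cycle")
```

Dividing the symmetric flow matrix F by its row sums is one standard way to manufacture a reversible chain with a prescribed stationary distribution.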

When a Markov process is reversible, its dynamics can be described in terms of an entropy function that acts like a potential, in that the entropy of the process is always increasing and reaches its maximum at the stationary distribution.

Detailed balance is a weaker condition than requiring the transition matrix to be symmetric, Pij = Pji. That would imply that the uniform distribution over the states would automatically be an equilibrium distribution. However, for continuous systems it may be possible to continuously transform the co-ordinates until a uniform metric is the equilibrium distribution, with a transition kernel which then is symmetric. In the discrete case it may be possible to achieve something similar, by breaking the Markov states into a degeneracy of sub-states.
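As an illustrative sketch (the symmetric matrix below is a made-up example), a symmetric transition matrix does make the uniform distribution an equilibrium distribution, and detailed balance then holds trivially:

```python
import numpy as np

# A hypothetical symmetric (hence doubly stochastic) transition matrix.
P = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
assert np.allclose(P, P.T)

# The uniform distribution is stationary: u P = u, because the
# columns of a symmetric stochastic matrix also sum to 1.
u = np.full(3, 1 / 3)
print(np.allclose(u @ P, u))        # True

# Detailed balance holds trivially: u_i P_ij = u_j P_ji.
flows = u[:, None] * P
print(np.allclose(flows, flows.T))  # True
```

This shows why symmetry is the stronger condition: it fixes the equilibrium distribution to be uniform, whereas detailed balance allows any positive π.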

Such an invariance is a supporting justification for the principle of equal a priori probabilities in statistical mechanics.

See also