Pinsker's inequality

From Wikipedia, the free encyclopedia
In [[information theory]], '''Pinsker's inequality''', named after its inventor [[Mark Semenovich Pinsker]], is an [[inequality (mathematics)|inequality]] that relates [[Kullback-Leibler divergence]] and the [[total variation distance]]. It states that if ''P'', ''Q'' are two [[probability distribution]]s, then


: <math>\sqrt{\frac{1}{2} D(P\|Q)} \ge \sup \{ |P(A) - Q(A)| : A\text{ is an event to which probabilities are assigned.} \}</math>


where ''D''(''P''&nbsp;||&nbsp;''Q'') is the [[Kullback-Leibler divergence]] in [[Nat (information)|nats]] and


: <math>\sup \{ |P(A) - Q(A)| : A\text{ is an event to which probabilities are assigned} \}</math>

is the total variation distance between ''P'' and ''Q''.
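
For example, if ''P'' and ''Q'' are Bernoulli distributions with parameters 0.4 and 0.6 (illustrative values chosen here only to make the bound concrete), then

: <math>D(P\|Q) = 0.4 \ln\frac{0.4}{0.6} + 0.6 \ln\frac{0.6}{0.4} \approx 0.0811 \text{ nats},</math>

so <math>\sqrt{\tfrac{1}{2} D(P\|Q)} \approx 0.201</math>, while the total variation distance is <math>|0.4 - 0.6| = 0.2</math>, consistent with the inequality.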

References

  • Thomas M. Cover and Joy A. Thomas: Elements of Information Theory, 2nd edition, Wiley-Interscience, 2006
  • Nicolò Cesa-Bianchi and Gábor Lugosi: Prediction, Learning, and Games, Cambridge University Press, 2006