Joint entropy

From Wikipedia, the free encyclopedia

Figure: Individual entropies $H(X)$ and $H(Y)$, joint entropy $H(X,Y)$, and conditional entropies for a pair of correlated subsystems $X, Y$ with mutual information $I(X;Y)$.

The joint entropy is an entropy measure used in information theory; it measures how much entropy is contained in a joint system of two random variables. If the random variables are $X$ and $Y$, the joint entropy is written $H(X,Y)$. Like other entropies, the joint entropy can be measured in bits, nats, or hartleys depending on the base of the logarithm.

Background

Given a random variable $X$, the entropy $H(X)$ describes our uncertainty about the value of $X$. If $X$ consists of several events $x$, which each occur with probability $p_x$, then the entropy of $X$ is

$$H(X) = -\sum_x p_x \log_2 p_x .$$

Consider another random variable $Y$, containing events $y$ occurring with probabilities $p_y$. $Y$ has entropy $H(Y)$.
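
For readers who prefer code, this formula translates directly into a few lines of Python (a minimal sketch; the helper name `entropy` and the example distribution are ours, not part of the article):

```python
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution.

    `probs` is a sequence of probabilities summing to 1; zero-probability
    events contribute nothing, by the convention 0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```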

However, if $X$ and $Y$ describe related events, the total entropy of the system may not be $H(X) + H(Y)$. For example, imagine we choose an integer between 1 and 8, with equal probability for each integer. Let $X$ represent whether the integer is even, and $Y$ represent whether the integer is prime. One-half of the integers between 1 and 8 are even, and one-half are prime, so $H(X) = H(Y) = 1$ bit. However, if we know that the integer is even, there is only a 1 in 4 chance that it is also prime; the distributions are related. The total entropy of the system is therefore less than 2 bits. We need a way of measuring the total entropy of both systems.

Definition

We solve this by considering each pair of possible outcomes $(x, y)$. If each pair of outcomes occurs with probability $p_{x,y}$, the joint entropy is defined as

$$H(X,Y) = -\sum_{x,y} p_{x,y} \log_2 p_{x,y} .$$

In the example above, we are not considering 1 to be a prime. The joint probability distribution then becomes:

                 X = even   X = odd
  Y = prime        1/8        3/8
  Y = not prime    3/8        1/8

Thus, the joint entropy is

$$H(X,Y) = -\tfrac{1}{8}\log_2\tfrac{1}{8} - \tfrac{3}{8}\log_2\tfrac{3}{8} - \tfrac{3}{8}\log_2\tfrac{3}{8} - \tfrac{1}{8}\log_2\tfrac{1}{8} = \tfrac{3}{4} + \tfrac{3}{4}\log_2\tfrac{8}{3} \approx 1.8113 \text{ bits}.$$
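
The same number can be checked with a short Python sketch (the dictionary layout and variable names are ours, for illustration):

```python
from math import log2

# Joint distribution of (parity, primality) for a uniform integer in 1..8:
# (even, prime) = {2}, (even, not prime) = {4, 6, 8},
# (odd, prime)  = {3, 5, 7}, (odd, not prime) = {1}.
joint = {
    ("even", "prime"): 1 / 8,
    ("even", "not prime"): 3 / 8,
    ("odd", "prime"): 3 / 8,
    ("odd", "not prime"): 1 / 8,
}

# H(X,Y) = -sum of p * log2(p) over all outcome pairs.
h_xy = -sum(p * log2(p) for p in joint.values() if p > 0)
print(h_xy)  # ~1.8113 bits
```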

Properties

Greater than subsystem entropies

The joint entropy is always at least equal to the entropies of the original subsystems; adding a new system can never reduce the available uncertainty:

$$H(X,Y) \geq \max\bigl(H(X),\, H(Y)\bigr).$$

This inequality is an equality, i.e. $H(X,Y) = H(X)$, if and only if $Y$ is a (deterministic) function of $X$.

Similarly, if $Y$ is a (deterministic) function of $X$, we also have

$$H(Y) \leq H(X) .$$
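
As a quick illustration (our own example, consistent with the setup above): if $X$ is a uniform integer between 1 and 8 and $Y$ is its parity, then $Y$ is a deterministic function of $X$, and indeed

$$H(X,Y) = H(X) = 3 \text{ bits}, \qquad H(Y) = 1 \text{ bit} \leq H(X).$$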


Subadditivity

Two systems, considered together, can never have more entropy than the sum of the entropy in each of them; this is an example of subadditivity:

$$H(X,Y) \leq H(X) + H(Y).$$

This inequality is an equality if and only if $X$ and $Y$ are statistically independent.
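
With the even/prime example above, both bounds can be checked directly from the numbers already computed:

$$\max\bigl(H(X), H(Y)\bigr) = 1 \text{ bit} \leq H(X,Y) \approx 1.8113 \text{ bits} \leq H(X) + H(Y) = 2 \text{ bits},$$

and both inequalities are strict, since $X$ and $Y$ are related but neither determines the other.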

Bounds

Like other entropies, $H(X,Y) \geq 0$.

Continuous case

The above discussion concerns discrete variables, but (as with univariate entropy) the analogous definition can be applied to continuous variables. The differential entropy of a continuous joint distribution with density $f(x,y)$ is defined as above, but with the summation replaced by an integral over the two-dimensional parameter space:

$$h(X,Y) = -\iint f(x,y) \log f(x,y) \, dx \, dy ,$$

and it shares many of the same properties.
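
As a rough numerical sketch (assuming NumPy is available; the example bivariate normal density, grid bounds, and step size are our own choices), the double integral can be approximated by a Riemann sum and compared against the known closed form $\tfrac{1}{2}\ln\bigl((2\pi e)^2 \det \Sigma\bigr)$ for a bivariate normal:

```python
import numpy as np

# Example covariance matrix for a correlated Gaussian pair (X, Y).
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
det = np.linalg.det(cov)
inv = np.linalg.inv(cov)

# Tabulate the joint density f(x, y) on a grid covering the bulk of the mass.
step = 0.01
axis = np.arange(-6.0, 6.0, step)
x, y = np.meshgrid(axis, axis)
quad = inv[0, 0] * x**2 + 2 * inv[0, 1] * x * y + inv[1, 1] * y**2
f = np.exp(-0.5 * quad) / (2.0 * np.pi * np.sqrt(det))

# h(X,Y) = -double integral of f * log(f), approximated by a Riemann sum
# (natural log, so the result is in nats; f > 0 everywhere on this grid).
h_numeric = -np.sum(f * np.log(f)) * step**2

# Closed form for a bivariate normal: (1/2) ln((2*pi*e)^2 det(cov)).
h_exact = 0.5 * np.log((2.0 * np.pi * np.e) ** 2 * det)
print(h_numeric, h_exact)  # both approximately 2.694 nats
```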

Relations to other entropy measures

The joint entropy is used in the definitions of the conditional entropy:

$$H(X|Y) = H(X,Y) - H(Y),$$

and the mutual information:

$$I(X;Y) = H(X) + H(Y) - H(X,Y).$$
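
Both definitions can be exercised on the running example (a quick worked check using $H(X) = H(Y) = 1$ bit and $H(X,Y) \approx 1.8113$ bits):

$$H(X|Y) \approx 1.8113 - 1 = 0.8113 \text{ bits}, \qquad I(X;Y) \approx 1 + 1 - 1.8113 = 0.1887 \text{ bits}.$$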

In quantum information theory, the joint entropy is generalized into the joint quantum entropy.
