Joint entropy
The '''joint entropy''' is an [[information entropy|entropy measure]] used in [[information theory]]. The joint entropy measures how much [[entropy (information theory)|entropy]] is contained in a joint system of two [[random variables]]. If the random variables are <math>X</math> and <math>Y</math>, the joint entropy is written <math>H(X,Y)</math>. Like other entropies, the joint entropy is measured in [[bit]]s.
==Background==
Given a random variable <math>X</math>, the entropy <math>H(X)</math> describes our uncertainty about the value of <math>X</math>. If <math>X</math> consists of several events <math>x</math>, which each occur with probability <math>p_x</math>, then the entropy of <math>X</math> is

:<math>H(X) = -\sum_x p_x \log_2 p_x.</math>
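The definition translates directly into code. The following is a minimal sketch in Python; the function name <code>entropy</code> and the list-of-probabilities input are illustrative choices made here, not notation from this article.

<syntaxhighlight lang="python">
from math import log2

def entropy(probs):
    # Shannon entropy, in bits, of a distribution given as a list of
    # probabilities (illustrative helper, not part of the article).
    # Terms with zero probability contribute nothing to the sum.
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin: 1.0 bit
</syntaxhighlight>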
Consider another random variable <math>Y</math>, containing events <math>y</math> occurring with probabilities <math>p_y</math>. <math>Y</math> has entropy <math>H(Y)</math>.
However, if <math>X</math> and <math>Y</math> describe related events, the total entropy of the system may not be <math>H(X) + H(Y)</math>. For example, imagine we choose an integer between 1 and 8, with equal probability for each integer. Let <math>X</math> represent whether the integer is even, and <math>Y</math> represent whether the integer is prime. One-half of the integers between 1 and 8 are even, and one-half are prime, so <math>H(X) = H(Y) = 1</math>. However, if we know that the integer is even, there is only a 1 in 4 chance that it is also prime; the distributions are related. The total entropy of the system is less than 2 bits. We need a way of measuring the total entropy of both systems.
==Definition==
We solve this by considering each pair of possible outcomes <math>(x,y)</math>. If each pair of outcomes occurs with probability <math>p_{x,y}</math>, the joint entropy is defined as

:<math>H(X,Y) = -\sum_{x,y} p_{x,y} \log_2 p_{x,y}.</math>
In the example above the joint entropy becomes <math>3 - \tfrac{3}{4}\log_2 3 \approx 1.811</math> bits.
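As a sketch, this value can be checked numerically with the <code>entropy</code> helper above. The joint distribution is read off the example, counting 1 as non-prime, as the article's tally of four primes between 1 and 8 (namely 2, 3, 5, 7) implies.

<syntaxhighlight lang="python">
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Pairs (even?, prime?) for a uniform integer 1..8:
#   (even, prime)     = {2}       -> 1/8
#   (even, not prime) = {4, 6, 8} -> 3/8
#   (odd,  prime)     = {3, 5, 7} -> 3/8
#   (odd,  not prime) = {1}       -> 1/8
joint = [1/8, 3/8, 3/8, 1/8]

print(entropy(joint))  # about 1.811 bits, less than the 2 bits of H(X) + H(Y)
</syntaxhighlight>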
==Properties==
===Greater than subsystem entropies===
The joint entropy is always at least equal to the entropies of the original systems; adding a new system can never reduce the available uncertainty:

:<math>H(X,Y) \geq H(X).</math>

This inequality is an equality if and only if <math>Y</math> is a (deterministic) function of <math>X</math>.

If <math>Y</math> is a (deterministic) function of <math>X</math>, we also have

:<math>H(X) \geq H(Y).</math>
===Subadditivity===
Two systems, considered together, can never have more entropy than the sum of the entropy in each of them; this is an example of subadditivity:

:<math>H(X,Y) \leq H(X) + H(Y).</math>

This inequality is an equality if and only if <math>X</math> and <math>Y</math> are statistically independent.
===Bounds===
Like other entropies, <math>H(X,Y) \geq 0</math> always.
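All three properties can be verified numerically on the even/prime example; this Python sketch assumes the marginal and joint distributions worked out above.

<syntaxhighlight lang="python">
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

H_X  = entropy([1/2, 1/2])            # even vs. odd
H_Y  = entropy([1/2, 1/2])            # prime vs. not prime
H_XY = entropy([1/8, 3/8, 3/8, 1/8])  # the joint distribution

assert H_XY >= max(H_X, H_Y)  # at least each subsystem entropy
assert H_XY <= H_X + H_Y      # subadditivity
assert H_XY >= 0              # non-negative, like any entropy
</syntaxhighlight>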
==Relations to other entropy measures==
The joint entropy is used in the definitions of the [[conditional entropy]]:

:<math>H(X|Y) = H(X,Y) - H(Y)</math>

and the [[mutual information]]:

:<math>I(X;Y) = H(X) + H(Y) - H(X,Y).</math>
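Both identities can be evaluated on the running example; the numbers below are a sketch assuming the joint entropy of about 1.811 bits computed earlier.

<syntaxhighlight lang="python">
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

H_X  = entropy([1/2, 1/2])
H_Y  = entropy([1/2, 1/2])
H_XY = entropy([1/8, 3/8, 3/8, 1/8])

print(H_XY - H_Y)        # H(X|Y): about 0.811 bits left in X once Y is known
print(H_X + H_Y - H_XY)  # I(X;Y): about 0.189 bits shared between X and Y
</syntaxhighlight>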
In [[quantum information theory]], the joint entropy is generalized into the [[joint quantum entropy]].
==References==
* Korn, Theresa M.; Korn, Granino Arthur. ''Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review''. New York: Dover Publications. pp. 613–614. ISBN 0-486-41147-8.
[[Category:Entropy and information]]