Joint entropy
[[Image:Entropy-mutual-information-relative-entropy-relation-diagram.svg|thumb|256px|right|Individual (H(X),H(Y)), joint (H(X,Y)), and conditional entropies for a pair of correlated subsystems X,Y with mutual information I(X; Y).]]
'''Joint [[entropy (information theory)|entropy]]''' is a measure of the uncertainty associated with a set of [[random variables|variables]].
== Definition ==
The joint entropy of two variables <math>X</math> and <math>Y</math> is defined as

:<math>H(X,Y) = -\sum_{x} \sum_{y} P(x,y) \log_2[P(x,y)]</math>

where <math>x</math> and <math>y</math> are particular values of <math>X</math> and <math>Y</math>, respectively, <math>P(x,y)</math> is the probability of these values occurring together, and <math>P(x,y) \log_2[P(x,y)]</math> is defined to be 0 if <math>P(x,y) = 0</math>.
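As a concrete illustration (a minimal sketch, not part of the definition itself): the hypothetical helper below computes the joint entropy in bits from a joint probability table stored as a NumPy array, with the zero-probability convention handled explicitly.

<syntaxhighlight lang="python">
import numpy as np

def joint_entropy(p):
    """Joint entropy in bits of a distribution given as an array.

    Each axis of the array is one variable and each entry is the
    probability of that combination of values.  Zero entries contribute
    nothing, matching the convention that P log2 P = 0 when P = 0.
    """
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                 # drop zeros: 0 * log2(0) is taken as 0
    return -np.sum(nz * np.log2(nz))

# Hypothetical joint distribution of two binary variables X and Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(joint_entropy(p_xy))        # ~1.722 bits
</syntaxhighlight>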
For more than two variables <math>X_1, \ldots, X_n</math> this expands to

:<math>H(X_1, \ldots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]</math>

where <math>x_1, \ldots, x_n</math> are particular values of <math>X_1, \ldots, X_n</math>, respectively, <math>P(x_1, \ldots, x_n)</math> is the probability of these values occurring together, and <math>P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]</math> is defined to be 0 if <math>P(x_1, \ldots, x_n) = 0</math>.
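The sketch above already covers this case, since each array axis plays the role of one variable; for instance, an assumed uniform distribution over three binary variables gives 3 bits:

<syntaxhighlight lang="python">
import numpy as np

# Assumed example: a uniform joint distribution over three binary
# variables, so each of the 2*2*2 outcomes has probability 1/8.
p = np.full((2, 2, 2), 1/8)
print(-np.sum(p * np.log2(p)))    # 3.0 bits: -8 * (1/8) * log2(1/8)
</syntaxhighlight>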
== Properties ==
=== Greater than individual entropies ===
The joint entropy of a set of variables is greater than or equal to all of the individual entropies of the variables in the set.
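For two variables this reads

:<math>H(X,Y) \geq \max[H(X), H(Y)].</math>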
=== Less than sum of individual entropies ===
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set. This is an example of [[subadditivity]]:

:<math>H(X,Y) \leq H(X) + H(Y).</math>

This inequality is an equality if and only if <math>X</math> and <math>Y</math> are statistically independent.
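Both bounds can be checked numerically (a sketch, using the same assumed example distribution as above):

<syntaxhighlight lang="python">
import numpy as np

def H(p):
    """Entropy in bits of a probability array (0 log 0 taken as 0)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

p_xy = np.array([[0.4, 0.1],      # assumed correlated example
                 [0.1, 0.4]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

# max[H(X), H(Y)] <= H(X,Y) <= H(X) + H(Y)
print(max(H(p_x), H(p_y)), H(p_xy), H(p_x) + H(p_y))   # 1.0  ~1.722  2.0

# The upper bound is attained exactly when X and Y are independent:
p_ind = np.outer(p_x, p_y)        # independent joint, same marginals
print(np.isclose(H(p_ind), H(p_x) + H(p_y)))           # True
</syntaxhighlight>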
== Relations to other entropy measures ==
Joint entropy is used in the definition of [[conditional entropy]]

:<math>H(X|Y) = H(X,Y) - H(Y)</math>

and of [[mutual information]]:

:<math>I(X;Y) = H(X) + H(Y) - H(X,Y).</math>
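Both identities are easy to verify numerically (again a sketch with the assumed example distribution):

<syntaxhighlight lang="python">
import numpy as np

def H(p):
    """Entropy in bits of a probability array (0 log 0 taken as 0)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

p_xy = np.array([[0.4, 0.1],      # assumed example joint distribution
                 [0.1, 0.4]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

print(H(p_xy) - H(p_y))           # H(X|Y) = H(X,Y) - H(Y), ~0.722 bits
print(H(p_x) + H(p_y) - H(p_xy))  # I(X;Y), ~0.278 bits
</syntaxhighlight>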
In quantum information theory, the joint entropy is generalized into the joint quantum entropy.