The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ is defined as
$$\mathrm{H}(X,Y) = -\sum_{x}\sum_{y} P(x,y) \log_2[P(x,y)]$$
where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the joint probability of these values occurring together, and $P(x,y) \log_2[P(x,y)]$ is defined to be 0 if $P(x,y)=0$.
For more than two random variables $X_1, \ldots, X_n$ this expands to
$$\mathrm{H}(X_1,\ldots,X_n) = -\sum_{x_1}\cdots\sum_{x_n} P(x_1,\ldots,x_n) \log_2[P(x_1,\ldots,x_n)]$$
where $x_1,\ldots,x_n$ are particular values of $X_1,\ldots,X_n$, respectively, $P(x_1,\ldots,x_n)$ is the probability of these values occurring together, and $P(x_1,\ldots,x_n) \log_2[P(x_1,\ldots,x_n)]$ is defined to be 0 if $P(x_1,\ldots,x_n)=0$.
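A minimal sketch of this definition in Python (the function name, the NumPy dependency, and the example distribution are illustrative choices, not part of the article):

```python
import numpy as np

def joint_entropy(p_joint):
    """Joint Shannon entropy (in bits) of a discrete joint distribution.

    p_joint holds the joint probabilities P(x1, ..., xn) as an array;
    zero-probability entries are skipped, matching the convention that
    P * log2(P) is 0 when P = 0.
    """
    p = np.asarray(p_joint, dtype=float)
    p = p[p > 0]                      # drop zero-probability cells
    return -np.sum(p * np.log2(p))

# Example: a uniform joint distribution over two binary variables.
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])
print(joint_entropy(p_xy))            # 2.0 bits
```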
Properties
Nonnegativity
The joint entropy of a set of random variables is a nonnegative number.
Greater than individual entropies
The joint entropy of a set of variables is greater than or equal to all of the individual entropies of the variables in the set.
Less than or equal to the sum of individual entropies
The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set. This is an example of subadditivity:
$$\mathrm{H}(X,Y) \le \mathrm{H}(X) + \mathrm{H}(Y)$$
This inequality is an equality if and only if $X$ and $Y$ are statistically independent.[2]: 30
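These inequalities can be checked numerically in the spirit of the joint_entropy helper sketched above; the dependent joint distribution below is an illustrative choice:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector or table."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A joint distribution in which X and Y are dependent (illustrative values).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)                           # marginal distribution of X
p_y = p_xy.sum(axis=0)                           # marginal distribution of Y

h_xy = entropy(p_xy)
print(h_xy >= max(entropy(p_x), entropy(p_y)))   # True: at least each marginal entropy
print(h_xy <= entropy(p_x) + entropy(p_y))       # True, strict here since X and Y are dependent
```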
The above definition applies to discrete random variables and is not valid for continuous random variables. The continuous version of discrete joint entropy is called joint differential (or continuous) entropy. Let $X$ and $Y$ be continuous random variables with a joint probability density function $f(x,y)$. The joint differential entropy is defined as
$$h(X,Y) = -\int f(x,y) \log f(x,y) \,dx\,dy .$$
For more than two continuous random variables $X_1,\ldots,X_n$ the definition is generalized to
$$h(X_1,\ldots,X_n) = -\int f(x_1,\ldots,x_n) \log f(x_1,\ldots,x_n) \,dx_1 \cdots dx_n .$$
The integral is taken over the support of $f$. It is possible that the integral does not exist, in which case we say that the differential entropy is not defined.
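As an illustration, the sketch below approximates the joint differential entropy of a bivariate Gaussian density by brute-force numerical integration and compares it with the known closed form $\tfrac{1}{2}\ln\left[(2\pi e)^2 \det\Sigma\right]$ (in nats); the covariance matrix, the grid settings, and the SciPy dependency are assumptions made for the example:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Grid approximation of h(X,Y) = -(integral of f * ln f) for a bivariate
# Gaussian, compared with the closed form 0.5 * ln((2*pi*e)^2 * det(Sigma)).
cov = np.array([[1.0, 0.5],
                [0.5, 2.0]])
rv = multivariate_normal(mean=[0.0, 0.0], cov=cov)

xs = np.linspace(-10.0, 10.0, 400)            # grid covering essentially all the probability mass
ys = np.linspace(-10.0, 10.0, 400)
X, Y = np.meshgrid(xs, ys)
f = rv.pdf(np.dstack([X, Y]))
dx, dy = xs[1] - xs[0], ys[1] - ys[0]
mask = f > 0                                  # integrand is taken as 0 where f = 0
h_numeric = -np.sum(f[mask] * np.log(f[mask])) * dx * dy

h_closed_form = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))
print(h_numeric, h_closed_form)               # both approximately 3.12 nats
```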
Properties
As in the discrete case, the joint differential entropy of a set of random variables is less than or equal to the sum of the entropies of the individual random variables:
$$h(X_1,\ldots,X_n) \le \sum_{i=1}^n h(X_i)$$
The following chain rule holds for two random variables:
$$h(X,Y) = h(X) + h(Y \mid X)$$
In the case of more than two random variables this generalizes to:
$$h(X_1,\ldots,X_n) = \sum_{i=1}^n h(X_i \mid X_1,\ldots,X_{i-1})$$
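The two-variable chain rule can be verified in closed form for jointly Gaussian variables, whose differential entropies are known analytically; the covariance values below are illustrative:

```python
import numpy as np

# For jointly Gaussian X, Y the differential entropies have closed forms (in nats),
# so the chain rule h(X,Y) = h(X) + h(Y|X) can be checked directly.
var_x, var_y, cov_xy = 1.0, 2.0, 0.5          # illustrative covariance entries
det_sigma = var_x * var_y - cov_xy ** 2       # determinant of the covariance matrix

h_xy = 0.5 * np.log((2 * np.pi * np.e) ** 2 * det_sigma)
h_x = 0.5 * np.log(2 * np.pi * np.e * var_x)
h_y_given_x = 0.5 * np.log(2 * np.pi * np.e * (var_y - cov_xy ** 2 / var_x))

print(np.isclose(h_xy, h_x + h_y_given_x))    # True
```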
Joint differential entropy is also used in the definition of the mutual information between continuous random variables:
$$\operatorname{I}(X,Y) = h(X) + h(Y) - h(X,Y)$$
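Continuing the Gaussian example above, the mutual information obtained from differential entropies can be compared against its known closed form $-\tfrac{1}{2}\ln(1-\rho^2)$; the parameter values are again illustrative:

```python
import numpy as np

var_x, var_y, cov_xy = 1.0, 2.0, 0.5          # illustrative covariance entries
rho = cov_xy / np.sqrt(var_x * var_y)         # correlation coefficient

h_x = 0.5 * np.log(2 * np.pi * np.e * var_x)
h_y = 0.5 * np.log(2 * np.pi * np.e * var_y)
h_xy = 0.5 * np.log((2 * np.pi * np.e) ** 2 * (var_x * var_y - cov_xy ** 2))

mi_from_entropies = h_x + h_y - h_xy
mi_closed_form = -0.5 * np.log(1 - rho ** 2)  # Gaussian mutual information (nats)
print(np.isclose(mi_from_entropies, mi_closed_form))   # True
```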
References
^ Korn, Theresa M.; Korn, Granino Arthur. Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review. New York: Dover Publications. ISBN 0-486-41147-8.
^ a b Cover, Thomas M.; Thomas, Joy A. Elements of Information Theory. Hoboken, New Jersey: Wiley. ISBN 0-471-24195-4.