In the study of probability, given two random variables $X$ and $Y$ that are defined on the same probability space, the joint distribution for $X$ and $Y$ gives the probability of events expressed in terms of both $X$ and $Y$. In the case of only two random variables, this is called a bivariate distribution, but the concept generalizes to any number of random variables, giving a multivariate distribution.
Example
Consider the roll of a fair die and let $A = 1$ if the number is even (that is, 2, 4, or 6) and $A = 0$ otherwise. Furthermore, let $B = 1$ if the number is prime (that is, 2, 3, or 5) and $B = 0$ otherwise. Then the joint distribution of $A$ and $B$ is

$$P(A=0, B=0) = P\{1\} = \tfrac{1}{6}, \qquad P(A=1, B=0) = P\{4, 6\} = \tfrac{2}{6},$$
$$P(A=0, B=1) = P\{3, 5\} = \tfrac{2}{6}, \qquad P(A=1, B=1) = P\{2\} = \tfrac{1}{6}.$$
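These values can be reproduced by direct enumeration of the six equally likely outcomes; the following is a minimal Python sketch of that tally:

```python
# Enumerate the six equally likely die rolls and tally the joint
# probability mass function of A (even indicator) and B (prime indicator).
from fractions import Fraction

joint = {}
for roll in range(1, 7):
    a = 1 if roll % 2 == 0 else 0          # A = 1 if the number is even
    b = 1 if roll in (2, 3, 5) else 0      # B = 1 if the number is prime
    joint[(a, b)] = joint.get((a, b), Fraction(0)) + Fraction(1, 6)

for (a, b), p in sorted(joint.items()):
    print(f"P(A={a}, B={b}) = {p}")
# P(A=0, B=0) = 1/6   (roll 1)
# P(A=0, B=1) = 1/3   (rolls 3, 5)
# P(A=1, B=0) = 1/3   (rolls 4, 6)
# P(A=1, B=1) = 1/6   (roll 2)
```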
Cumulative distribution
The cumulative distribution function for a pair of random variables $X, Y$ is defined in terms of their joint probability distribution:

$$F_{X,Y}(x, y) = P(X \le x,\, Y \le y).$$
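For instance, in the die example above, the joint cumulative distribution function of $A$ and $B$ takes the values

$$F_{A,B}(0, 0) = P(A = 0, B = 0) = \tfrac{1}{6}, \qquad F_{A,B}(1, 0) = P(B = 0) = \tfrac{1}{2},$$
$$F_{A,B}(0, 1) = P(A = 0) = \tfrac{1}{2}, \qquad F_{A,B}(1, 1) = 1.$$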
Discrete case
The joint probability mass function of two discrete random variables $X, Y$ is equal to

$$p_{X,Y}(x, y) = P(X = x,\, Y = y) = P(Y = y \mid X = x)\, P(X = x) = P(X = x \mid Y = y)\, P(Y = y).$$
In general, the joint probability distribution of $n$ discrete random variables $X_1, \ldots, X_n$ is equal to

$$P(X_1 = x_1, \ldots, X_n = x_n) = P(X_1 = x_1)\, P(X_2 = x_2 \mid X_1 = x_1) \cdots P(X_n = x_n \mid X_1 = x_1, \ldots, X_{n-1} = x_{n-1}).$$
This identity is known as the chain rule of probability.
Since these are probabilities, we have

$$\sum_x \sum_y P(X = x,\, Y = y) = 1.$$
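As a concrete check, the following self-contained sketch verifies both the chain-rule factorization and the normalization on the die example above:

```python
# Check that the joint pmf of the die example factors as
# P(A=a, B=b) = P(B=b | A=a) P(A=a), and that the probabilities sum to one.
from fractions import Fraction

joint = {(0, 0): Fraction(1, 6), (0, 1): Fraction(1, 3),
         (1, 0): Fraction(1, 3), (1, 1): Fraction(1, 6)}

# Marginal P(A=a), obtained by summing the joint pmf over b.
p_A = {a: sum(p for (a2, b), p in joint.items() if a2 == a) for a in (0, 1)}

for (a, b), p in joint.items():
    cond = joint[(a, b)] / p_A[a]          # P(B=b | A=a)
    assert cond * p_A[a] == p              # chain rule holds exactly
assert sum(joint.values()) == 1            # normalization
```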
Continuous case
Similarly for continuous random variables, the joint probability density function can be written as $f_{X,Y}(x, y)$, and this is

$$f_{X,Y}(x, y) = f_{Y \mid X}(y \mid x)\, f_X(x) = f_{X \mid Y}(x \mid y)\, f_Y(y),$$

where $f_{Y \mid X}(y \mid x)$ and $f_{X \mid Y}(x \mid y)$ give the conditional distributions of $Y$ given $X = x$ and of $X$ given $Y = y$ respectively, and $f_X(x)$ and $f_Y(y)$ give the marginal distributions for $X$ and $Y$ respectively.

Again, since these are probability distributions, one has

$$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx\, dy = 1.$$
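As an illustration, the following sketch (assuming SciPy is available) checks the factorization numerically for a standard bivariate normal pair with correlation $\rho$, for which $Y \mid X = x$ is normal with mean $\rho x$ and variance $1 - \rho^2$:

```python
# Numerically verify f_{X,Y}(x, y) = f_{Y|X}(y | x) f_X(x) for a
# standard bivariate normal with correlation rho.
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.6
joint = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])

x, y = 0.3, -1.2
f_joint = joint.pdf([x, y])
# Conditional density of Y given X = x, times the marginal density of X.
f_factored = norm(rho * x, np.sqrt(1 - rho**2)).pdf(y) * norm(0, 1).pdf(x)
assert np.isclose(f_joint, f_factored)
```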
Mixed case
In some situations $X$ is continuous but $Y$ is discrete. For example, in a logistic regression, one may wish to predict the probability of a binary outcome $Y$ conditional on the value of a continuously distributed $X$. In this case, $(X, Y)$ has neither a probability density function nor a probability mass function in the sense of the terms given above. On the other hand, a "mixed joint density" can be defined in either of two ways:

$$f_{X,Y}(x, y) = f_{X \mid Y}(x \mid y)\, P(Y = y) = P(Y = y \mid X = x)\, f_X(x).$$
Formally, $f_{X,Y}(x, y)$ is the probability density function of $(X, Y)$ with respect to the product measure on the respective supports of $X$ and $Y$. Either of these two decompositions can then be used to recover the joint cumulative distribution function:

$$F_{X,Y}(x, y) = \sum_{t \le y} \int_{-\infty}^{x} f_{X,Y}(s, t)\, ds.$$
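A minimal sketch of such a mixed pair, with the illustrative choices $X \sim N(0, 1)$ and $Y \mid X = x \sim \mathrm{Bernoulli}(\sigma(x))$ as in logistic regression, checks that the mixed joint density sums and integrates to one over the product of the two supports:

```python
# Mixed joint density f_{X,Y}(x, y) = P(Y = y | X = x) f_X(x), with
# X standard normal and Y | X = x Bernoulli with success probability
# sigmoid(x). Summing over y and integrating over x should give 1.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def mixed_density(x, y):
    p = sigmoid(x)                       # P(Y = 1 | X = x)
    return (p if y == 1 else 1 - p) * norm.pdf(x)

total = sum(quad(mixed_density, -np.inf, np.inf, args=(y,))[0] for y in (0, 1))
assert np.isclose(total, 1.0)
```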
The definition generalizes to a mixture of arbitrary numbers of discrete and continuous random variables.
General multidimensional distributions
The cumulative distribution function for a vector of random variables $X_1, \ldots, X_n$ is defined in terms of their joint probability distribution:

$$F_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = P(X_1 \le x_1, \ldots, X_n \le x_n).$$
The joint distribution for two random variables can be extended to many random variables $X_1, \ldots, X_n$ by adding them sequentially with the identity

$$f_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = f_{X_n \mid X_1,\ldots,X_{n-1}}(x_n \mid x_1, \ldots, x_{n-1})\, f_{X_1,\ldots,X_{n-1}}(x_1, \ldots, x_{n-1}),$$

where

$$f_{X_n \mid X_1,\ldots,X_{n-1}}(x_n \mid x_1, \ldots, x_{n-1}) = \frac{f_{X_1,\ldots,X_n}(x_1, \ldots, x_n)}{f_{X_1,\ldots,X_{n-1}}(x_1, \ldots, x_{n-1})}$$

and

$$F_{X_n \mid X_1,\ldots,X_{n-1}}(x_n \mid x_1, \ldots, x_{n-1}) = \int_{-\infty}^{x_n} f_{X_n \mid X_1,\ldots,X_{n-1}}(t \mid x_1, \ldots, x_{n-1})\, dt$$

(notice that these latter identities can be useful for generating a random variable with a given distribution function $F$); the density of the marginal distribution is

$$f_{X_1,\ldots,X_{n-1}}(x_1, \ldots, x_{n-1}) = \int_{-\infty}^{\infty} f_{X_1,\ldots,X_n}(x_1, \ldots, x_n)\, dx_n.$$
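As an aside on that remark, a standard way to generate a draw from a given distribution function $F$ is inverse transform sampling: if $U$ is uniform on $(0, 1)$, then $F^{-1}(U)$ has distribution function $F$. A minimal sketch, with an exponential $F$ and an illustrative tolerance:

```python
# Inverse transform sampling for the Exponential(lam) distribution,
# whose quantile function is F^{-1}(u) = -ln(1 - u) / lam.
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0
u = rng.uniform(size=100_000)
x = -np.log(1.0 - u) / lam               # F^{-1}(u) applied to uniform draws

# The sample mean should be close to the exponential mean 1/lam.
assert abs(x.mean() - 1.0 / lam) < 0.01
```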
The joint cumulative distribution function is

$$F_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_{X_1,\ldots,X_n}(t_1, \ldots, t_n)\, dt_n \cdots dt_1,$$

and the conditional distribution function is accordingly

$$F_{X_n \mid X_1,\ldots,X_{n-1}}(x_n \mid x_1, \ldots, x_{n-1}) = \frac{\int_{-\infty}^{x_n} f_{X_1,\ldots,X_n}(x_1, \ldots, x_{n-1}, t)\, dt}{f_{X_1,\ldots,X_{n-1}}(x_1, \ldots, x_{n-1})}.$$
Expectation reads

$$\operatorname{E}\, h(X_1, \ldots, X_n) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} h(x_1, \ldots, x_n)\, f_{X_1,\ldots,X_n}(x_1, \ldots, x_n)\, dx_1 \cdots dx_n.$$

Suppose that $h$ is smooth enough and vanishes suitably at infinity (for example, has compact support). Then, by iterated integration by parts,

$$\operatorname{E}\, h(X_1, \ldots, X_n) = (-1)^n \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \frac{\partial^n h(x_1, \ldots, x_n)}{\partial x_1 \cdots \partial x_n}\, F_{X_1,\ldots,X_n}(x_1, \ldots, x_n)\, dx_1 \cdots dx_n.$$
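A numerical sketch of this identity in the one-dimensional case ($n = 1$), with the illustrative choices of a standard normal $X$ and $h(x) = e^{-x^2}$:

```python
# For n = 1 the identity reads E[h(X)] = - \int h'(x) F_X(x) dx when h
# vanishes at infinity. Here h(x) = exp(-x^2), so h'(x) = -2x exp(-x^2).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

h = lambda x: np.exp(-x**2)
h_prime = lambda x: -2 * x * np.exp(-x**2)

lhs = quad(lambda x: h(x) * norm.pdf(x), -np.inf, np.inf)[0]      # E[h(X)]
rhs = -quad(lambda x: h_prime(x) * norm.cdf(x), -np.inf, np.inf)[0]
assert np.isclose(lhs, rhs)
```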
Joint distribution for independent variables
If for discrete random variables $P(X = x,\, Y = y) = P(X = x)\, P(Y = y)$ for all $x$ and $y$, or for absolutely continuous random variables $f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$ for all $x$ and $y$, then $X$ and $Y$ are said to be independent.
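On the die example above, $A$ and $B$ fail this test, since $P(A=1, B=1) = 1/6$ while $P(A=1)\, P(B=1) = \tfrac{1}{2} \cdot \tfrac{1}{2} = \tfrac{1}{4}$; a quick sketch:

```python
# A and B from the die example are not independent: the joint probability
# P(A=1, B=1) differs from the product of the marginals.
from fractions import Fraction

joint = {(0, 0): Fraction(1, 6), (0, 1): Fraction(1, 3),
         (1, 0): Fraction(1, 3), (1, 1): Fraction(1, 6)}
p_A1 = joint[(1, 0)] + joint[(1, 1)]     # P(A=1) = 1/2
p_B1 = joint[(0, 1)] + joint[(1, 1)]     # P(B=1) = 1/2
assert joint[(1, 1)] != p_A1 * p_B1      # 1/6 != 1/4, so not independent
```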
Joint distribution for conditionally independent variables
If a subset $A$ of the variables $X_1, \ldots, X_n$ is conditionally independent given another subset $B$ of these variables, then the joint distribution $P(X_1, \ldots, X_n)$ is equal to $P(B)\, P(A \mid B)$. Therefore, it can be efficiently represented by the lower-dimensional probability distributions $P(B)$ and $P(A \mid B)$. Such conditional independence relations can be represented with a Bayesian network.
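A minimal sketch (with illustrative numbers) of such a factorization for a chain $X \to Y \to Z$ of binary variables, where $X$ and $Z$ are conditionally independent given $Y$:

```python
# Represent the joint P(X, Y, Z) = P(X) P(Y | X) P(Z | Y) by three small
# conditional tables instead of one full 2x2x2 joint table; the saving
# grows rapidly with the number of variables.
import itertools

p_x = {0: 0.7, 1: 0.3}                                  # P(X)
p_y_given_x = {(0, 0): 0.9, (1, 0): 0.1,                # P(Y=y | X=x), keyed (y, x)
               (0, 1): 0.2, (1, 1): 0.8}
p_z_given_y = {(0, 0): 0.6, (1, 0): 0.4,                # P(Z=z | Y=y), keyed (z, y)
               (0, 1): 0.3, (1, 1): 0.7}

joint = {(x, y, z): p_x[x] * p_y_given_x[(y, x)] * p_z_given_y[(z, y)]
         for x, y, z in itertools.product((0, 1), repeat=3)}
assert abs(sum(joint.values()) - 1.0) < 1e-12           # a valid distribution
```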