Joint probability distribution: Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 06:47, 15 November 2012

In the study of probability, given two random variables X and Y that are defined on the same probability space, the joint distribution for X and Y defines the probability of events defined in terms of both X and Y. In the case of only two random variables this is called a bivariate distribution, but the concept generalizes to any number of random variables, giving a multivariate distribution. The formula for a joint probability takes a different form depending on whether the events involved are dependent or independent.

The joint probability function of a set of variables can be used to find a variety of other probability distributions. The joint probability density function can be found by differentiating the joint cumulative distribution function with respect to each of the variables. A marginal density ("marginal distribution" in the discrete case) is found by integrating (or summing, in the discrete case) the joint distribution over the domain of the other variable(s). A conditional probability distribution can be calculated by taking the joint density and dividing it by the marginal density of one (or more) of the variables.
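As a concrete sketch of these operations in the discrete case (the Python names and the particular table below are illustrative, not from the article):

```python
# Illustrative sketch: deriving a marginal and a conditional distribution
# from a discrete joint distribution, as described above.

# Joint pmf P(X = x, Y = y) stored as a dict; the values sum to 1.
joint = {
    (0, 0): 1/6, (0, 1): 2/6,
    (1, 0): 2/6, (1, 1): 1/6,
}

def marginal_x(joint, x):
    # Marginal of X: sum the joint probabilities over all values of Y.
    return sum(p for (xi, yi), p in joint.items() if xi == x)

def conditional_y_given_x(joint, y, x):
    # Conditional of Y given X = x: joint divided by the marginal of X.
    return joint[(x, y)] / marginal_x(joint, x)
```

For this table, `marginal_x(joint, 0)` gives 1/6 + 2/6 = 0.5, and `conditional_y_given_x(joint, 1, 0)` gives (2/6) / 0.5 = 2/3.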

Example

Consider the roll of a fair die and let A = 1 if the number is even (i.e. 2, 4, or 6) and A = 0 otherwise. Furthermore, let B = 1 if the number is prime (i.e. 2, 3, or 5) and B = 0 otherwise. Then, the joint distribution of A and B is

P(A = 0, B = 0) = P{1} = 1/6,   P(A = 1, B = 0) = P{4, 6} = 2/6,
P(A = 0, B = 1) = P{3, 5} = 2/6,   P(A = 1, B = 1) = P{2} = 1/6.
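This joint distribution can be checked by enumerating the six equally likely outcomes; a short Python sketch (the variable names are mine):

```python
from fractions import Fraction
from collections import Counter

# Enumerate the six equally likely die outcomes and tabulate the two
# indicator variables: "even" (2, 4, 6) and "prime" (2, 3, 5).
counts = Counter()
for roll in range(1, 7):
    even = int(roll % 2 == 0)
    prime = int(roll in (2, 3, 5))
    counts[(even, prime)] += 1

# Each outcome has probability 1/6, so the joint pmf is count / 6.
joint = {k: Fraction(v, 6) for k, v in counts.items()}
```

Only the roll 2 is both even and prime, so `joint[(1, 1)]` is 1/6, matching the table above.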

Cumulative distribution

The cumulative distribution function for a pair of random variables X and Y is defined in terms of their joint probability distribution:

FX,Y(x, y) = P(X ≤ x, Y ≤ y),

where the right-hand side is the probability that X takes a value less than or equal to x and, simultaneously, Y takes a value less than or equal to y.
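In the discrete case the joint cumulative distribution function is just a sum of joint probabilities; a minimal sketch (the table reuses the even/prime example, and the function name is mine):

```python
# F_{X,Y}(x, y) = P(X <= x and Y <= y), computed from a discrete joint pmf.
joint = {(0, 0): 1/6, (0, 1): 2/6, (1, 0): 2/6, (1, 1): 1/6}

def joint_cdf(joint, x, y):
    # Sum the joint pmf over all pairs dominated by (x, y).
    return sum(p for (xi, yi), p in joint.items() if xi <= x and yi <= y)
```

For instance `joint_cdf(joint, 0, 0)` is 1/6, and `joint_cdf(joint, 1, 1)` sums the whole table to 1.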

Discrete case

The joint probability mass function of two discrete random variables is equal to

P(X = x and Y = y) = P(Y = y | X = x) · P(X = x) = P(X = x | Y = y) · P(Y = y).

In general, the joint probability distribution of n discrete random variables X1, …, Xn is equal to

P(X1 = x1, …, Xn = xn) = P(X1 = x1) · P(X2 = x2 | X1 = x1) · P(X3 = x3 | X1 = x1, X2 = x2) · … · P(Xn = xn | X1 = x1, …, Xn−1 = xn−1).

This identity is known as the chain rule of probability.

Since these are probabilities, we have

Σx Σy P(X = x and Y = y) = 1,

generalizing for n discrete random variables X1, …, Xn to

Σx1 Σx2 … Σxn P(X1 = x1, …, Xn = xn) = 1.
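The chain rule and the normalization condition can be verified numerically; the sketch below (my construction, not the article's) builds an arbitrary joint pmf over three binary variables and checks both:

```python
import itertools
import random

random.seed(0)

# An arbitrary joint pmf over three binary variables (X1, X2, X3); the
# raw weights are random, and normalizing them makes a valid pmf.
outcomes = list(itertools.product((0, 1), repeat=3))
weights = [random.random() for _ in outcomes]
joint = {o: w / sum(weights) for o, w in zip(outcomes, weights)}

def marg(fixed):
    """Probability of the event {X_i = v for every (i, v) in fixed}."""
    return sum(p for o, p in joint.items()
               if all(o[i] == v for i, v in fixed))

# Chain rule: P(x1, x2, x3) = P(x1) * P(x2 | x1) * P(x3 | x1, x2).
x = (1, 0, 1)
chain = (marg([(0, x[0])])
         * marg([(0, x[0]), (1, x[1])]) / marg([(0, x[0])])
         * joint[x] / marg([(0, x[0]), (1, x[1])]))
```

Here `chain` agrees with `joint[x]` up to floating-point rounding, and the pmf values sum to one.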

Continuous case

Similarly for continuous random variables, the joint probability density function can be written as fX,Y(x, y) and this is

fX,Y(x, y) = fY|X(y|x) fX(x) = fX|Y(x|y) fY(y),

where fY|X(y|x) and fX|Y(x|y) give the conditional distributions of Y given X = x and of X given Y = y respectively, and fX(x) and fY(y) give the marginal distributions for X and Y respectively.

Again, since these are probability distributions, one has

∫x ∫y fX,Y(x, y) dy dx = 1.
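The normalization of a joint density can be checked by numerical integration; a minimal sketch, taking X and Y to be independent Exp(1) variables of my choosing so that fX,Y(x, y) = e^(−x) e^(−y):

```python
import math

# Joint density f_{X,Y}(x, y) = f_X(x) * f_{Y|X}(y|x); here X and Y are
# independent Exp(1) variables, so the density factors as e^{-x} * e^{-y}.
def f_joint(x, y):
    return math.exp(-x) * math.exp(-y)

# Midpoint-rule double integral over [0, 30]^2; the tail mass beyond 30
# is on the order of e^{-30} and is negligible.
n, hi = 600, 30.0
h = hi / n
total = sum(f_joint((i + 0.5) * h, (j + 0.5) * h)
            for i in range(n) for j in range(n)) * h * h
```

`total` comes out close to 1, as the displayed condition requires.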

Mixed case

In some situations X is continuous but Y is discrete. For example, in a logistic regression one may wish to predict the probability of a binary outcome Y conditional on the value of a continuously distributed X. In this case, (X, Y) has neither a probability density function nor a probability mass function in the sense of the terms given above. On the other hand, a "mixed joint density" can be defined in either of two ways:

fX,Y(x, y) = fX|Y(x | y) P(Y = y) = P(Y = y | X = x) fX(x).

Formally, fX,Y(x, y) is the probability density function of (X, Y) with respect to the product measure on the respective supports of X and Y. Either of these two decompositions can then be used to recover the joint cumulative distribution function:

FX,Y(x, y) = Σ(t ≤ y) ∫(s ≤ x) fX,Y(s, t) ds,

where the sum runs over the values t of Y with t ≤ y.

The definition generalizes to a mixture of arbitrary numbers of discrete and continuous random variables.
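A mixed joint density of the logistic-regression kind can be written down directly; in the sketch below (my parameter choices, not the article's) X is standard normal and Y given X = x is Bernoulli with success probability sigmoid(x):

```python
import math

# Mixed joint density: X ~ N(0, 1) (continuous), and the binary outcome
# Y | X = x ~ Bernoulli(sigmoid(x)) (discrete), as in logistic regression.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def phi(x):
    # Standard normal density.
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def mixed_density(x, y):
    p = sigmoid(x)
    return (p if y == 1 else 1.0 - p) * phi(x)

# Summing over y in {0, 1} and integrating over x should give 1
# (midpoint rule on [-8, 8]; the normal tail beyond is negligible).
n, lo, hi = 4000, -8.0, 8.0
h = (hi - lo) / n
total = sum((mixed_density(lo + (i + 0.5) * h, 0)
             + mixed_density(lo + (i + 0.5) * h, 1)) * h for i in range(n))
```

By the symmetry sigmoid(−x) = 1 − sigmoid(x), this particular model also gives P(Y = 1) = 1/2.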

General multidimensional distributions

Recall that the cumulative distribution function for a vector of random variables X1, …, Xn is defined in terms of their joint probability distribution:

FX1,…,Xn(x1, …, xn) = P(X1 ≤ x1, …, Xn ≤ xn).


The joint distribution for two random variables can be extended to many random variables X1, …, Xn by adding them sequentially with the identity

fX1,…,Xn(x1, …, xn) = fXn|X1,…,Xn−1(xn | x1, …, xn−1) · fX1,…,Xn−1(x1, …, xn−1),

where

fXn|X1,…,Xn−1(xn | x1, …, xn−1) = fX1,…,Xn(x1, …, xn) / fX1,…,Xn−1(x1, …, xn−1)

and

FXn|X1,…,Xn−1(xn | x1, …, xn−1) = ∫(t ≤ xn) fXn|X1,…,Xn−1(t | x1, …, xn−1) dt

(notice that these latter identities can be useful to generate a random vector (X1, …, Xn) with a given distribution function, by drawing each Xi in turn from its conditional distribution given the previously drawn values); the density of the marginal distribution is

fX1(x1) = ∫ … ∫ fX1,…,Xn(x1, x2, …, xn) dx2 … dxn.

The joint cumulative distribution function is

FX1,…,Xn(x1, …, xn) = ∫(t1 ≤ x1) … ∫(tn ≤ xn) fX1,…,Xn(t1, …, tn) dtn … dt1,

and the conditional distribution function is accordingly

FXn|X1,…,Xn−1(xn | x1, …, xn−1) = ∫(t ≤ xn) fX1,…,Xn(x1, …, xn−1, t) dt / fX1,…,Xn−1(x1, …, xn−1).
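The remark about generating random variables refers to sequential (inverse-transform) sampling: draw X1 from its marginal, then each later variable from its conditional distribution given the values already drawn. A toy sketch of my choosing, with X ~ Exp(1) and Y | X = x uniform on [0, x]:

```python
import math
import random

random.seed(42)

# Sequential sampling through conditional distributions: invert the
# marginal CDF of X, then the conditional CDF of Y given X = x.
def sample_pair():
    u1, u2 = random.random(), random.random()
    x = -math.log(1.0 - u1)   # inverts F_X(x) = 1 - e^{-x}
    y = u2 * x                # inverts F_{Y|X}(y | x) = y / x on [0, x]
    return x, y

samples = [sample_pair() for _ in range(200_000)]
mean_y = sum(y for _, y in samples) / len(samples)
# By the tower rule, E[Y] = E[X / 2] = 1/2, which mean_y should approach.
```

With this many samples the Monte Carlo mean of Y lands close to the exact value 1/2.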


Expectation reads

E[h(X1, …, Xn)] = ∫ … ∫ h(x1, …, xn) fX1,…,Xn(x1, …, xn) dx1 … dxn.

Suppose that h is smooth enough and that h, together with its partial derivatives, vanishes as |x1| + … + |xn| → ∞; then, by iterated integration by parts,

E[h(X1, …, Xn)] = (−1)^n ∫ … ∫ FX1,…,Xn(x1, …, xn) (∂^n h / ∂x1 … ∂xn)(x1, …, xn) dx1 … dxn.

Joint distribution for independent variables

If P(X = x and Y = y) = P(X = x) · P(Y = y) for discrete random variables, for all x and y, or fX,Y(x, y) = fX(x) fY(y) for absolutely continuous random variables, for all x and y, then X and Y are said to be independent.

Joint distribution for conditionally dependent variables

If a subset A of the variables X1, …, Xn is conditionally dependent given another subset B of these variables, then the joint distribution P(X1, …, Xn) is equal to P(B) · P(A|B). Therefore, it can be efficiently represented by the lower-dimensional probability distributions P(B) and P(A|B). Such conditional independence relations can be represented with a Bayesian network.
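The efficiency gain shows up as parameter counting. For a chain-structured Bayesian network over binary variables, where each Xi is conditionally independent of X1, …, Xi−2 given Xi−1 (an illustrative structure of my choosing), the factored representation needs far fewer numbers than the full joint table:

```python
def full_joint_params(n):
    # Free entries in a full joint table over n binary variables
    # (2^n entries minus one normalization constraint).
    return 2 ** n - 1

def chain_params(n):
    # Chain-structured Bayesian network X1 -> X2 -> ... -> Xn: the joint
    # factors as P(X1) * prod_i P(X_i | X_{i-1}), needing one number for
    # P(X1 = 1) and two numbers per later variable (one per parent value).
    return 1 + 2 * (n - 1)
```

For n = 10 binary variables the full table has 1023 free parameters, while the chain factorization needs only 19.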

See also

  • "Joint continuous density function". PlanetMath.
  • Mathworld: Joint Distribution Function