Probability-generating function: Difference between revisions

From Wikipedia, the free encyclopedia
Eadric (talk | contribs)
m Reversion for offensive editing

Revision as of 23:05, 22 January 2006

In probability theory, the probability-generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability-generating functions are often employed for their succinct description of the probabilities Pr(X = i), and to make available the well-developed theory of power series with non-negative coefficients.

Definition

If X is a discrete random variable taking values on some subset of the non-negative integers, {0, 1, ...}, then the probability-generating function of X is defined as:

    G(z) = E(z^X) = Σ_{i=0}^∞ f(i) z^i,

where f is the probability mass function of X. Note that the equivalent notation G_X is sometimes used to distinguish between the probability-generating functions of several random variables.
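As a concrete illustration of the definition, here is a minimal sketch in Python; the fair six-sided die and the helper name `pgf` are our own illustrative choices, not part of the article:

```python
# Sketch: evaluating G(z) = E(z^X) directly from a pmf given as {value: prob}.
# The fair six-sided die is an illustrative choice.

def pgf(pmf, z):
    """G(z) = E(z^X) = sum over i of Pr(X = i) * z**i."""
    return sum(prob * z**i for i, prob in pmf.items())

die = {i: 1 / 6 for i in range(1, 7)}

g1 = pgf(die, 1.0)   # probabilities sum to one, so this is (approximately) 1
g0 = pgf(die, 0.0)   # G(0) = Pr(X = 0), which is 0 for a die
```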

Properties

Power series

Probability-generating functions obey all the rules of power series with non-negative coefficients. In particular, G(1-) = 1, where G(1-) = lim_{z→1} G(z), since the probabilities must sum to one; and the radius of convergence of any probability-generating function must be at least 1, by Abel's theorem for power series with non-negative coefficients.
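The radius-of-convergence claim can be illustrated numerically. In the sketch below, the geometric(p) variable is an illustrative choice: its series has radius of convergence 1/(1 − p) > 1, consistent with the "at least 1" bound.

```python
# Numeric sketch (distribution chosen for illustration): the geometric(p) pmf
# f(i) = p*(1-p)**(i-1) for i >= 1 has PGF G(z) = p*z / (1 - (1-p)*z),
# whose radius of convergence is 1/(1-p) > 1.

p = 0.4

def partial_pgf(z, terms=500):
    """Partial sum of the power series defining G(z)."""
    return sum(p * (1 - p)**(i - 1) * z**i for i in range(1, terms + 1))

def closed_form(z):
    return p * z / (1 - (1 - p) * z)

# Inside the radius of convergence (1/0.6 ≈ 1.67) the series matches the closed form,
# and in particular G(1-) = 1.
for z in (0.5, 1.0, 1.5):
    assert abs(partial_pgf(z) - closed_form(z)) < 1e-9
```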

Probabilities and expectations

The following properties allow the derivation of various basic quantities related to X:

1. The probability mass function of X is recovered by taking derivatives of G:

    f(k) = Pr(X = k) = G^(k)(0) / k!

2. It follows from Property 1 that if we have two random variables X and Y, and G_X = G_Y, then f_X = f_Y. That is, if X and Y have identical probability-generating functions, then they are identically distributed.

3. The normalization of the probability mass function can be expressed in terms of the generating function by

    E(1) = G(1-) = Σ_{i=0}^∞ f(i) = 1.

The expectation of X is given by

    E(X) = G'(1-).

More generally, the kth factorial moment, E(X(X − 1) ... (X − k + 1)), of X is given by

    E(X(X − 1) ... (X − k + 1)) = G^(k)(1-),   k ≥ 1.

So the variance of X can be obtained as

    Var(X) = E(X(X − 1)) + E(X) − [E(X)]^2 = G''(1-) + G'(1-) − [G'(1-)]^2.
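The moment identities above can be checked numerically. The following sketch uses a Binomial(n, p) variable as an illustrative example, evaluating G'(1) and G''(1) directly from the pmf:

```python
from math import comb

# Sketch (Binomial(n, p) is an illustrative choice): recover mean and variance
# from G'(1) = E(X) and G''(1) = E(X(X-1)).

n, p = 5, 0.3
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

g1 = sum(k * f for k, f in pmf.items())            # G'(1)  = E(X)
g2 = sum(k * (k - 1) * f for k, f in pmf.items())  # G''(1) = E(X(X-1))

mean = g1                  # equals n*p = 1.5
var = g2 + g1 - g1**2      # equals n*p*(1-p) = 1.05
```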

Functions of independent random variables

Probability-generating functions are particularly useful for dealing with functions of independent random variables. For example:

  • If X_1, X_2, ..., X_n is a sequence of independent (and not necessarily identically distributed) random variables, and

    S_n = a_1 X_1 + a_2 X_2 + ... + a_n X_n,

where the a_i are constants, then the probability-generating function is given by

    G_{S_n}(z) = E(z^{S_n}) = G_{X_1}(z^{a_1}) G_{X_2}(z^{a_2}) ... G_{X_n}(z^{a_n}).

For example, if

    S_n = X_1 + X_2 + ... + X_n,

then the probability-generating function, G_{S_n}(z), is given by

    G_{S_n}(z) = G_{X_1}(z) G_{X_2}(z) ... G_{X_n}(z).

It also follows that the probability-generating function of the difference of two independent random variables, S = X_1 − X_2, is

    G_S(z) = E(z^{X_1 − X_2}) = G_{X_1}(z) G_{X_2}(1/z).
  • Suppose that N is also an independent, discrete random variable taking values on the non-negative integers, with probability-generating function G_N, and let S_N = X_1 + X_2 + ... + X_N. If the X_1, X_2, ..., X_N are independent and identically distributed with common probability-generating function G_X, then

    G_{S_N}(z) = G_N(G_X(z)).

This can be seen as follows:

    G_{S_N}(z) = E(z^{S_N}) = E( E(z^{S_N} | N) ) = E( (G_X(z))^N ) = G_N(G_X(z)),

using the fact that, conditional on N = n, E(z^{S_N} | N = n) = (G_X(z))^n. This last fact is useful in the study of Galton–Watson processes.
Suppose again that N is an independent, discrete random variable taking values on the non-negative integers, with probability-generating function G_N. If the X_1, X_2, ..., X_N are independent, but not identically distributed, random variables, where G_{X_i} denotes the probability-generating function of X_i, then

    G_{S_N}(z) = Σ_{n ≥ 0} Pr(N = n) G_{X_1}(z) G_{X_2}(z) ... G_{X_n}(z).

For identically distributed X_i this simplifies to the identity stated before. The general case is sometimes useful to obtain a decomposition of S_N by means of generating functions.
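The random-sum identity G_{S_N}(z) = G_N(G_X(z)) can also be verified by brute force. In the sketch below, N (uniform on {0, 1, 2, 3}) and the X_i (Bernoulli(p)) are illustrative choices, not from the article:

```python
from math import comb

# Sketch: check G_{S_N}(z) = G_N(G_X(z)) for N uniform on {0, 1, 2, 3}
# and X_i ~ Bernoulli(p), independent of N and of each other.

p = 0.3

def g_x(z):                       # PGF of Bernoulli(p)
    return (1 - p) + p * z

def g_n(t):                       # PGF of N, uniform on {0, 1, 2, 3}
    return sum(t**n for n in range(4)) / 4

# Direct route: given N = n, S_N ~ Binomial(n, p), so mix the binomial pmfs.
pmf_s = {k: sum(0.25 * comb(n, k) * p**k * (1 - p)**(n - k)
                for n in range(k, 4))
         for k in range(4)}

def g_s(z):
    return sum(f * z**k for k, f in pmf_s.items())

# The two routes agree at every test point.
for z in (0.0, 0.5, 1.0, 2.0):
    assert abs(g_s(z) - g_n(g_x(z))) < 1e-9
```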

Examples

  • The probability-generating function of a binomial random variable, the number of successes in n trials, with probability p of success in each trial, is

    G(z) = [(1 − p) + pz]^n.

Note that this is the n-fold product of the probability-generating function of a Bernoulli random variable with parameter p.
  • The probability-generating function of a negative binomial random variable, the number of trials required to obtain the rth success, with probability p of success in each trial, is

    G(z) = ( pz / (1 − (1 − p)z) )^r.

Note that this is the r-fold product of the probability-generating function of a geometric random variable, the number of trials required to obtain the first success.
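The n-fold product remark can be checked by multiplying PGFs as coefficient lists; the helper `poly_mul` below is our own illustrative name, not from the article:

```python
from math import comb

# Sketch: PGF multiplication as coefficient convolution. Convolving the
# Bernoulli(p) coefficients [1-p, p] with themselves n times expands
# ((1-p) + p*z)**n, recovering the Binomial(n, p) pmf.

def poly_mul(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

n, p = 4, 0.25
coeffs = [1.0]                      # PGF of the constant 0
for _ in range(n):
    coeffs = poly_mul(coeffs, [1 - p, p])

binom_pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
assert all(abs(a - b) < 1e-12 for a, b in zip(coeffs, binom_pmf))
```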


The probability-generating function is occasionally called the z-transform of the probability mass function. It is an example of a generating function of a sequence (see formal power series).

Other generating functions of random variables include the moment-generating function and the characteristic function.