Joint distribution

In the study of probability, given two random variables X and Y defined on the same probability space, the joint distribution of X and Y is the probability distribution of the pair (X, Y); it specifies the probability of every combination of values that X and Y can take together.

The discrete case

For discrete random variables, the joint probability mass function can be written as P(X = x and Y = y). It can be factored through conditional probabilities as

P(X=x\ \mathrm{and}\ Y=y) = P(Y=y|X=x)P(X=x)= P(X=x|Y=y)P(Y=y).\;

Since these are probabilities, we have

\sum_x \sum_y P(X=x\ \mathrm{and}\ Y=y) = 1.\;
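As a concrete illustration (the numbers below are made up for this sketch, not taken from the article), the following Python snippet stores a small joint probability mass function, recovers the marginals of X and Y by summing, and checks that the probabilities sum to 1 and that both factorizations above give back the joint probability.

# A small, made-up joint PMF for X in {0, 1} and Y in {0, 1, 2}.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

# Marginal distributions, obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in joint_pmf.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# The joint probabilities sum to 1.
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

# Both factorizations recover the joint probability:
# P(Y=y|X=x) P(X=x) and P(X=x|Y=y) P(Y=y).
for (x, y), p in joint_pmf.items():
    p_y_given_x = p / p_x[x]
    p_x_given_y = p / p_y[y]
    assert abs(p_y_given_x * p_x[x] - p) < 1e-12
    assert abs(p_x_given_y * p_y[y] - p) < 1e-12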

The continuous case

Similarly, for continuous random variables, the joint probability density function can be written as f_{X,Y}(x, y), and it satisfies

f_{X,Y}(x,y) = f_{Y|X}(y|x)\, f_X(x) = f_{X|Y}(x|y)\, f_Y(y)

where f_{Y|X}(y|x) and f_{X|Y}(x|y) give the conditional distributions of Y given X = x and of X given Y = y respectively, and f_X(x) and f_Y(y) give the marginal distributions of X and Y respectively.

Again, since f_{X,Y} is a probability density, one has

\int_x \int_y f_{X,Y}(x,y) \; dy \; dx = 1.
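As a rough numerical check (the density below is a standard textbook example chosen only for illustration), the following Python sketch takes the joint density f(x, y) = x + y on the unit square and verifies with a midpoint Riemann sum that it integrates to 1 and that the marginal density of X comes out as x + 1/2.

import numpy as np

# Joint density f(x, y) = x + y on the unit square [0, 1] x [0, 1].
def f_xy(x, y):
    return x + y

# Midpoint grid for a simple Riemann-sum approximation of the integrals.
n = 1000
pts = (np.arange(n) + 0.5) / n
x, y = np.meshgrid(pts, pts, indexing="ij")
cell = (1.0 / n) ** 2

# Total probability mass should be (approximately) 1.
total = np.sum(f_xy(x, y)) * cell
print(total)  # ~1.0

# Marginal density of X: integrate the joint density over y.
fx = np.sum(f_xy(x, y), axis=1) / n   # f_X(x) ~ x + 1/2
print(fx[0], pts[0] + 0.5)            # the two values should be close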

Joint distribution of independent variables

If P(X = x\ \mathrm{and}\ Y = y) = P(X = x) \cdot P(Y = y) for all x and y in the discrete case, or f_{X,Y}(x,y) = f_X(x) \cdot f_Y(y) for all x and y in the continuous case, then X and Y are said to be independent.
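To make the product criterion concrete, here is a small Python sketch (with invented numbers) that tests whether a discrete joint PMF factors into the product of its marginals; the example PMF is built as an outer product, so the test succeeds by construction.

# Made-up marginals; the joint PMF below is their outer product,
# so X and Y are independent by construction.
p_x = {0: 0.4, 1: 0.6}
p_y = {0: 0.5, 1: 0.3, 2: 0.2}
joint_pmf = {(x, y): px * py for x, px in p_x.items() for y, py in p_y.items()}

def is_independent(joint, tol=1e-12):
    # Recompute the marginals from the joint PMF ...
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # ... and check P(X=x and Y=y) = P(X=x) * P(Y=y) for every pair.
    return all(abs(p - px[x] * py[y]) <= tol for (x, y), p in joint.items())

print(is_independent(joint_pmf))  # True for this constructed example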

Multidimensional distributions

The joint distribution of two random variables can be extended to any number of random variables X1, ..., Xn by adding one variable at a time, conditioning it on the variables already included, via the identity

f_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = f_{X_n | X_1, \ldots, X_{n-1}}( x_n | x_1, \ldots, x_{n-1}) f_{X_1, \ldots, X_{n-1}}( x_1, \ldots, x_{n-1} ) .
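As a small illustration of applying this identity repeatedly (all numbers below are invented), the following Python sketch builds a joint PMF of three binary variables from a marginal for X1, a conditional for X2 given X1, and a conditional for X3 given (X1, X2), and confirms that the result is a valid joint distribution.

import itertools

# Made-up building blocks, all over the values {0, 1}.
p_x1 = {0: 0.3, 1: 0.7}
p_x2_given_x1 = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.2, 1: 0.8}}
p_x3_given_x1x2 = {(x1, x2): {0: 0.5, 1: 0.5} for x1 in (0, 1) for x2 in (0, 1)}

# Apply the identity twice:
# f(x1, x2) = f(x2 | x1) f(x1), then f(x1, x2, x3) = f(x3 | x1, x2) f(x1, x2).
joint = {}
for x1, x2, x3 in itertools.product((0, 1), repeat=3):
    p12 = p_x2_given_x1[x1][x2] * p_x1[x1]
    joint[(x1, x2, x3)] = p_x3_given_x1x2[(x1, x2)][x3] * p12

# The result is a valid joint PMF: it sums to 1.
print(abs(sum(joint.values()) - 1.0) < 1e-12)  # True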
