
Joint Distributions 10/19, 10/20

Examples:
Suppose we consider three tosses of a coin, associating 1 to heads and 0 to tails each time, and let $X_i$ be the random variable recording the result of trial $i$. We can then consider the random vector $\tilde{X}=(X_1,X_2,X_3)$ that describes the three tosses. The state space for $\tilde{X}$ is $\Omega=\{0,1\}\times\{0,1\}\times\{0,1\}$, and we can compute the probability distribution on this space as a product of the individual coordinates' distributions because the random variables $X_i$ are independent.
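
For instance, assuming the coin is fair so that $P(X_i=1)=P(X_i=0)=\frac{1}{2}$, independence gives the probability of any particular triplet as a product:

\begin{displaymath}P(\tilde{X}=(1,0,1))=P(X_1=1)P(X_2=0)P(X_3=1)=
\frac{1}{2}\cdot\frac{1}{2}\cdot\frac{1}{2}=\frac{1}{8}\end{displaymath}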

Here is an example based on the same experiment, but with random variables that are different and not independent. Let $Y_i$ be the number of heads up to and including the $i$th toss, and set $\tilde{Y}=(Y_1,Y_2,Y_3)$. Now $Y_i$ takes values in $\{0,1,\ldots,i\}$, and many triplets are impossible: for instance $P(Y_2=0 \mid Y_1=1)=0$. The coordinate random variables are not independent, so we have to give the distribution of the vectors one by one; we cannot build it up from the marginals.
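
As a small worked computation (again assuming a fair coin): each sequence of three tosses determines the triplet $\tilde{Y}$ uniquely, since $Y_1, Y_2-Y_1, Y_3-Y_2$ recover the individual tosses, so each possible triplet has probability $\frac{1}{8}$ and the impossible ones have probability 0:

\begin{displaymath}P(\tilde{Y}=(0,1,1))=P(T,H,T)=\frac{1}{8}
\qquad and \qquad P(\tilde{Y}=(0,2,2))=0\end{displaymath}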

Example:
In the example on color blindness, suppose we consider the binary random variables associated to gender (0 if male, 1 if female) and to color blindness (1 if colorblind, 0 if not); such 0-1 variables are called indicator variables. We can tabulate the probabilities of all 4 possible pairs of outcomes as:
\begin{displaymath}\begin{array}{l\vert cc\vert c}
\mathrm{Gender} & \mathrm{Male}\;(0) & \mathrm{Female}\;(1) & \mathrm{Total}\\
\hline
\mathrm{Colorblind}\;(1) & \frac{16}{512} & \frac{2}{512} & \frac{18}{512}\\
\mathrm{Not\; colorblind}\;(0) & \frac{240}{512} & \frac{254}{512} & \frac{494}{512}\\
\hline
\mathrm{Total} & \frac{1}{2} & \frac{1}{2} & 1\\
\end{array}\end{displaymath}

From this table of the joint distribution we read:

\begin{displaymath}P(colorblind\; and\; male)=P(0,1)=\frac{16}{512}\end{displaymath}
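
Combining the table with the definition of conditional probability gives, for instance:

\begin{displaymath}P(colorblind \mid male)=
\frac{P(colorblind\; and\; male)}{P(male)}=
\frac{16/512}{1/2}=\frac{32}{512}=\frac{1}{16}\end{displaymath}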

In general, when we build the joint distribution of two random variables we can make such a two-way table; for more than two variables such a flat table is no longer possible.

Definition:
In the case of two random variables X and Y, we define their joint probability mass function as:

\begin{displaymath}p(x,y)=P\{X=x, Y=y \}\end{displaymath}
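
Like any probability mass function, $p$ is nonnegative and its values sum to 1 over all pairs (a standard property, stated here for completeness):

\begin{displaymath}\sum_{x}\sum_{y} p(x,y)=1\end{displaymath}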


The row sums and column sums produce the complete distribution functions for the coordinate random variables; these are called the marginal probabilities. Here, for instance, we have:

\begin{displaymath}P(colorblind)= \frac{16}{512} + \frac{2}{512}=
\frac{18}{512}\; and \; P(not\; colorblind)= \frac{494}{512}\end{displaymath}
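
Similarly, summing down a column of the table recovers the gender marginal; for instance:

\begin{displaymath}P(male)= \frac{16}{512} + \frac{240}{512}=
\frac{256}{512}=\frac{1}{2}\end{displaymath}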

In general, given the joint distribution $p(x,y)$ of two random variables X and Y, we have the marginal distributions

\begin{displaymath}p_X(x)=\sum_{y:\, p(x,y)>0} p(x,y) \qquad
and
\qquad p_Y(y)=\sum_{x:\, p(x,y)>0} p(x,y) \end{displaymath}
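
The marginals also give a quick check of independence: X and Y are independent exactly when $p(x,y)=p_X(x)\,p_Y(y)$ for every pair (x,y). In the color blindness table this fails, for instance:

\begin{displaymath}P(colorblind\; and\; male)=\frac{16}{512}
\neq \frac{18}{512}\cdot\frac{1}{2}=P(colorblind)P(male)\end{displaymath}

so gender and color blindness are not independent, as we saw earlier from the conditional probabilities.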



 
Susan Holmes
1998-12-07