
Joint Probability Distribution for n variables 10/21

Example of the multinomial

Consider a sequence of n identical, independent experiments, each resulting in one of r possible outcomes, with probabilities $p_1,p_2,p_3,\ldots,p_r$, $\sum p_i=1$. Denote by $X_i$ the number of experiments that result in outcome i. The joint distribution of $(X_1,X_2,\ldots,X_r)$ is the multinomial; note that the $X_i$'s are not independent, since they must sum to n.
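
For n experiments in total, the joint probability mass function is the standard multinomial formula: for nonnegative integers $n_1,n_2,\ldots,n_r$ with $\sum n_i=n$,

\begin{displaymath}P(X_1=n_1,X_2=n_2,\ldots,X_r=n_r)=\frac{n!}{n_1!\,n_2!\cdots n_r!}\,p_1^{n_1}p_2^{n_2}\cdots p_r^{n_r}\end{displaymath}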

Mutually Independent events:
The events $\{A_1,A_2,\ldots, A_n\}$ are said to be mutually independent iff for every subset of two or more of the events $\{A_i,A_j,\ldots, A_r\}$, the multiplicative property of intersection holds:

\begin{displaymath}P(A_i\cap A_j\cap \cdots \cap A_r)=P(A_i)P(A_j)\cdots P(A_r)\end{displaymath}
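
A standard example (added here for illustration) shows that the product rule must hold for every subset, not just pairs: toss two fair coins and let $A_1$ = first coin is heads, $A_2$ = second coin is heads, $A_3$ = the two coins agree. Each pair is independent, since $P(A_i\cap A_j)=1/4=P(A_i)P(A_j)$, but $P(A_1\cap A_2\cap A_3)=1/4\neq 1/8=P(A_1)P(A_2)P(A_3)$, so the three events are pairwise independent without being mutually independent.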

Mutually Independent random variables:
The random variables $X_1, X_2,\ldots, X_n$ are said to be mutually independent iff for any vector $\tilde{r}=(r_1,r_2,\ldots, r_n)$ in the cartesian product state space $\Omega=R_1\times R_2\times\cdots\times R_n$, we have:

\begin{displaymath}P(X_1=r_1,X_2=r_2,\ldots,X_n=r_n)=\prod_{i=1}^n P(X_i=r_i)\end{displaymath}
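
As a minimal numerical sketch (in Python with NumPy; not part of the original notes), one can check empirically that the multinomial counts from the example above violate this product rule:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# n = 3 trials, r = 2 equally likely outcomes; X1 + X2 = 3 always,
# so X1 and X2 cannot be mutually independent.
counts = rng.multinomial(3, [0.5, 0.5], size=100_000)

# The empirical joint probability P(X1 = 3, X2 = 3) is 0,
# since the two counts must sum to 3 ...
joint = np.mean((counts[:, 0] == 3) & (counts[:, 1] == 3))

# ... but the product of the marginals P(X1 = 3) * P(X2 = 3)
# is roughly (1/8)^2 = 1/64, so the factorization fails.
prod = np.mean(counts[:, 0] == 3) * np.mean(counts[:, 1] == 3)

print(joint, prod)  # approximately 0.0 vs 0.0156
\end{verbatim}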


