
Gambler's Ruin 10/21

Suppose we perform a sequence of Bernoulli(p) trials, which we call coin tosses with a p-coin. A wins a toss if the coin comes up heads, and B wins if it comes up tails. Whenever A wins a toss he takes \$1 from B; whenever B wins, B takes \$1 from A. Suppose A starts with $i$ dollars and the total capital between the two players is $N$ dollars. The game is played until one of the players goes bankrupt.
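Before deriving the win probability, it helps to be able to simulate a single game. A minimal Python sketch (the function name `ruin_game` is mine, not from the notes): A's fortune does a random walk on $\{0,\dots,N\}$, stepping $+1$ with probability $p$ and $-1$ with probability $q=1-p$, until it hits $0$ or $N$.

```python
import random

def ruin_game(i, N, p, rng=random):
    """Play one gambler's-ruin game. A starts with i dollars out of a
    total capital of N. Returns True if A wins everything (reaches N),
    False if A goes bankrupt (reaches 0)."""
    fortune = i
    while 0 < fortune < N:
        # A gains $1 with probability p (heads), loses $1 otherwise
        fortune += 1 if rng.random() < p else -1
    return fortune == N
```

Averaging the outcome over many independent games gives a Monte Carlo estimate of $P_i$, which can later be compared with the closed-form answer.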

Call $E_i=\{ A \mbox{ wins the whole game starting with } \$ i\}$.

Then we do a ``one-step'' analysis: writing $P_i=P(E_i)$ and conditioning on the outcome of the first toss, we can show that:

\begin{eqnarray*}
P_i &=& P(E_i\vert A\mbox{ wins first toss})P(A\mbox{ wins first toss})\\
& & {}+P(E_i\vert A\mbox{ loses first toss})P(A\mbox{ loses first toss})\\
&=& pP_{i+1}+qP_{i-1}.
\end{eqnarray*}

Since $p+q=1$, we may write the left side as $(p+q)P_i$, and rearranging gives

\begin{displaymath}
P_{i+1}-P_i=\frac{q}{p}\left(P_i-P_{i-1}\right).
\end{displaymath}

Iterating this, and using $P_0=0$, gives $P_{i+1}-P_i=\left(\frac{q}{p}\right)^i P_1$. Summing these differences for $i=1,\dots,N-1$,

\begin{eqnarray*}
P_N-P_1 &=& \left( \left( \frac{q}{p}\right)^{N-1}+\cdots +\frac{q}{p} \right)P_1.
\end{eqnarray*}
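The recurrence $P_i=pP_{i+1}+qP_{i-1}$ with $P_0=0$, $P_N=1$ can also be solved numerically, following exactly the telescoping argument above: build up the differences $P_{i+1}-P_i\propto (q/p)^i$ and rescale so that $P_N=1$. A sketch (the function name `ruin_probs` is mine):

```python
def ruin_probs(N, p):
    """Solve P_i = p*P_{i+1} + q*P_{i-1}, P_0 = 0, P_N = 1.
    The differences satisfy P_{i+1} - P_i = (q/p)**i * P_1, so we
    accumulate them with P_1 provisionally set to 1, then rescale
    so that P_N = 1. Returns the list [P_0, P_1, ..., P_N]."""
    q = 1 - p
    r = q / p
    diffs = [r**i for i in range(N)]   # P_{i+1} - P_i, up to the factor P_1
    P = [0.0]                          # boundary condition P_0 = 0
    for d in diffs:
        P.append(P[-1] + d)
    total = P[-1]                      # unnormalized value of P_N
    return [x / total for x in P]
```

For a fair coin ($p=1/2$, so $q/p=1$) all the differences are equal and the result is linear in $i$, in agreement with the $P_i=i/N$ case below.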


Now we use the two boundary conditions $P_0=0$ and $P_N=1$, and the recurrence above leads to the following conclusions (using the sum of the geometric series with ratio $\frac{q}{p}$):

\begin{displaymath}P_i=\left\{ \begin{array}{ll}
\frac{1-\left(\frac{q}{p}\right)^i}{1-\left(\frac{q}{p}\right)^N} & \mbox{if } \frac{q}{p}\neq 1\\[2mm]
iP_1=\frac{i}{N} & \mbox{if } \frac{q}{p}=1
\end{array}\right.
\end{displaymath}
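The closed form translates directly into a few lines of Python (the function name `ruin_prob` is mine), with the fair-coin case handled separately since the geometric-series formula breaks down when $\frac{q}{p}=1$:

```python
def ruin_prob(i, N, p):
    """Closed-form probability that A wins the whole game,
    starting with i dollars out of a total capital of N,
    when A wins each toss with probability p."""
    q = 1 - p
    if q == p:                     # fair coin: q/p = 1, so P_i = i/N
        return i / N
    r = q / p
    return (1 - r**i) / (1 - r**N)
```

Comparing this against a Monte Carlo average of simulated games is a quick sanity check that the derivation is correct.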



Susan Holmes
1998-12-07