Next: Conditional Expectation Up: Expectation and Variance 11/6 Previous: Conditional Expectation 11/9

   
Variance 11/10

E[X] does not say anything about the spread of the values around the mean.

This is measured by

\begin{displaymath}Var(X)=E[(X-\mu)^2],\mbox{ where }\mu=E[X]\end{displaymath}

which can also be written using the computational formula:

\begin{displaymath}Var(X)=E[X^2]-(E[X])^2, \mbox{ often denoted } \sigma^2\end{displaymath}

Variance is measured in the square of the units of X; to work in the same units as X, we very often use the standard deviation

\begin{displaymath}SD(X)=\sqrt{Var(X)}=\sigma\end{displaymath}
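A small Python sketch (my own example, a fair six-sided die) checking that the definition $E[(X-\mu)^2]$ and the computational formula $E[X^2]-(E[X])^2$ give the same variance:

```python
# Fair die: X uniform on {1,...,6}
values = [1, 2, 3, 4, 5, 6]
probs = [1/6] * 6

mu = sum(v * p for v, p in zip(values, probs))                 # E[X] = 3.5
var_def = sum((v - mu)**2 * p for v, p in zip(values, probs))  # E[(X-mu)^2]
ex2 = sum(v**2 * p for v, p in zip(values, probs))             # E[X^2]
var_comp = ex2 - mu**2                                         # computational formula

sd = var_comp ** 0.5                                           # SD(X) = sqrt(Var(X))
print(var_def, var_comp, sd)  # both variances are 35/12, about 2.917
```

Both routes give $35/12$, as they must.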


\begin{displaymath}Var(aX+b)=a^2Var(X), \qquad SD(aX+b)=\vert a\vert SD(X)\end{displaymath}
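The scaling rule can be verified exactly on a small distribution (a toy example of mine, X uniform on {0, 1, 2} with a = -3, b = 7):

```python
def var(values, probs):
    """Variance from the definition E[(X - mu)^2]."""
    mu = sum(v * p for v, p in zip(values, probs))
    return sum((v - mu)**2 * p for v, p in zip(values, probs))

values = [0, 1, 2]
probs = [1/3, 1/3, 1/3]
a, b = -3, 7

lhs = var([a * v + b for v in values], probs)  # Var(aX + b)
rhs = a**2 * var(values, probs)                # a^2 Var(X); the shift b drops out
print(lhs, rhs)  # both are 6.0
```

Note that $b$ disappears entirely: shifting a distribution moves its mean but not its spread, and the sign of $a$ is lost in $a^2$, which is why $SD(aX+b)=\vert a\vert SD(X)$.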

For the Bernoulli(p): since X takes only the values 0 and 1, X^2=X, so

\begin{displaymath}var(X)=E(X^2)-(E(X))^2=p-p^2=p(1-p)=pq\end{displaymath}
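A one-line numerical check of this (with p = 0.3, a value I chose for illustration):

```python
p = 0.3
q = 1 - p
values, probs = [0, 1], [q, p]

ex = sum(v * pr for v, pr in zip(values, probs))      # E[X] = p
ex2 = sum(v**2 * pr for v, pr in zip(values, probs))  # E[X^2] = p, since 0^2=0 and 1^2=1
variance = ex2 - ex**2
print(variance, p * q)  # both are about 0.21
```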

Property 1:
For two independent random variables, X and Y:

\begin{displaymath}Var(X+Y)=Var(X)+Var(Y)\end{displaymath}

This essential property allows us to compute the variance of a binomial S_n, because we can write a binomial as the sum of n independent Bernoulli(p) random variables X_i, so that:

\begin{displaymath}var(S_n)=var(\sum_i X_i)= n var(X_i)=npq\end{displaymath}
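To see that $npq$ really is the binomial variance, we can compute it directly from the Binomial(n, p) pmf (n = 10, p = 0.4 are my own illustrative choices):

```python
from math import comb

n, p = 10, 0.4
q = 1 - p
# Binomial(n, p) pmf: P(S_n = k) = C(n, k) p^k q^(n-k)
pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

mu = sum(k * pk for k, pk in enumerate(pmf))      # E[S_n] = np
ex2 = sum(k**2 * pk for k, pk in enumerate(pmf))  # E[S_n^2]
variance = ex2 - mu**2
print(variance, n * p * q)  # both are about 2.4
```

The brute-force computation from the pmf agrees with the shortcut $npq$ obtained by summing n Bernoulli variances.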

Example (which I `did' in class!):
What is the variance of the Geometric(p)? Using the computational formula, we first compute E(X^2):

\begin{eqnarray*}E(X^2)&=&\sum_{j=1}^{+\infty} j^2 P(X=j) =\sum_{j=1}^{+\infty} j^2 q^{j-1}p = \frac{1}{p}+\frac{2q}{p^2}\\
var(X)&=&E(X^2)-(E(X))^2 = \frac{1}{p}+ \frac{2q}{p^2}-
\frac{1}{p^2}\\
&=& \frac{q}{p^2}\\
\end{eqnarray*}
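We can check $Var(X)=q/p^2$ numerically by truncating the series $\sum_j j^2 q^{j-1}p$ at a large index (p = 0.25 and the truncation point are illustrative choices of mine; the geometric tail is negligible there):

```python
p = 0.25
q = 1 - p
N = 2000  # truncation point; q^N is astronomically small

# Geometric(p) on {1, 2, ...}: P(X = j) = q^(j-1) p
ex = sum(j * q**(j - 1) * p for j in range(1, N + 1))      # ~ 1/p = 4
ex2 = sum(j**2 * q**(j - 1) * p for j in range(1, N + 1))  # ~ 1/p + 2q/p^2 = 28
variance = ex2 - ex**2
print(variance, q / p**2)  # both are about 12.0
```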


From the additivity of variance for independent variables (Property 1), and the fact that a Negative Binomial Y_r can be written as the sum of r independent Geometric(p) random variables, we have:

\begin{displaymath}var(Y_r)=\frac{rq}{p^2}\end{displaymath}

I also showed in class that the variance of the Poisson($\lambda$) random variable is $\lambda$, a fact that helps recognize a Poisson random variable.
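The Poisson fact can be checked the same way, by truncating the pmf (here $\lambda = 3$ and the truncation point are my own illustrative choices):

```python
from math import exp, factorial

lam = 3.0
N = 60  # Poisson(3) mass beyond 60 is negligible

# Poisson(lambda) pmf: P(X = k) = e^(-lambda) lambda^k / k!
pmf = [exp(-lam) * lam**k / factorial(k) for k in range(N + 1)]

mu = sum(k * pk for k, pk in enumerate(pmf))      # ~ lambda
ex2 = sum(k**2 * pk for k, pk in enumerate(pmf))  # ~ lambda + lambda^2
print(ex2 - mu**2, lam)  # both are about 3.0
```

Mean and variance coming out equal is the signature property: a count variable whose sample variance is close to its sample mean is a candidate Poisson.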



 
Susan Holmes
1998-12-07