
Variance 11/17

For a continuous random variable X with density f and expectation $\mu$, the variance is:

\begin{displaymath}Var[X]=E((X-\mu)^2)=\int_{-\infty}^{\infty} (x-\mu)^2 f(x) dx\end{displaymath}
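
Expanding the square and using linearity of expectation gives an equivalent form that is often easier to compute with:

\begin{displaymath}Var(X)=E(X^2)-\mu^2\end{displaymath}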

Proposition 1:

For any constant b, Var(X+b)=Var(X)
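
A one-line check: since $E(X+b)=\mu+b$, $Var(X+b)=E((X+b-(\mu+b))^2)=E((X-\mu)^2)=Var(X)$. Shifting a variable moves its center but not its spread.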

Proposition 2:

For any constant a, $Var(aX)=a^2\,Var(X)$
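
Similarly, since $E(aX)=a\mu$, $Var(aX)=E((aX-a\mu)^2)=a^2 E((X-\mu)^2)=a^2\,Var(X)$; rescaling a variable rescales its spread quadratically.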

Proposition 3:
If X and Y are independent random variables, then

\begin{displaymath}Var(X+Y)=Var(X) + Var(Y)\end{displaymath}
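
To see why independence matters, expand the square in the definition:

\begin{displaymath}Var(X+Y)=Var(X)+Var(Y)+2\left(E(XY)-E(X)E(Y)\right)\end{displaymath}

For independent X and Y, E(XY)=E(X)E(Y), so the cross term vanishes; without independence it need not.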

Examples:

1.
For the exponential random variable with parameter $\lambda$, I showed:

\begin{displaymath}Var(X)=\frac{1}{\lambda^2}\end{displaymath}
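
A quick check, using $E(X)=\frac{1}{\lambda}$ and integrating by parts twice for the second moment:

\begin{displaymath}E(X^2)=\int_0^{\infty} x^2 \lambda e^{-\lambda x} dx=\frac{2}{\lambda^2},
\qquad Var(X)=\frac{2}{\lambda^2}-\frac{1}{\lambda^2}=\frac{1}{\lambda^2}\end{displaymath}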

2.
For U a uniform random variable on [0,1], I showed: $Var(U)=\frac{1}{12}$.
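
The computation is short: $E(U)=\int_0^1 u\,du=\frac{1}{2}$ and $E(U^2)=\int_0^1 u^2\,du=\frac{1}{3}$, so $Var(U)=\frac{1}{3}-\frac{1}{4}=\frac{1}{12}$.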

3.

\begin{displaymath}Z \sim Normal(0,1) \Longrightarrow Var(Z)=1\end{displaymath}

For the Normal $(\mu,\sigma^2)$ random variable X:

\begin{displaymath}Var(X)=\sigma^2\end{displaymath}
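
A derivation from Propositions 1 and 2: writing $X=\mu+\sigma Z$ with $Z \sim Normal(0,1)$ gives $Var(X)=Var(\sigma Z)=\sigma^2\,Var(Z)=\sigma^2$.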

This is the meaning of the second parameter in the definition of the density; its square root, called the standard deviation, gives the width of the curve.

I explained in class what standardizing a variable meant in general, not just for Normals.
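
In brief: if X has mean $\mu$ and standard deviation $\sigma>0$, its standardized version is

\begin{displaymath}Z=\frac{X-\mu}{\sigma}\end{displaymath}

which by Propositions 1 and 2 has mean 0 and variance 1, whatever the distribution of X.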



Susan Holmes
1998-12-07