
Variance 11/17

For a continuous random variable $X$ with density $f$ and expectation $\mu$, its variance is:

\begin{displaymath}Var[X]=E((X-\mu)^2)=\int_{-\infty}^{\infty} (x-\mu)^2 f(x) dx\end{displaymath}
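As an illustration (not from the lecture), this integral can be approximated numerically. The helper below, `variance_from_density`, is a hypothetical sketch using a midpoint Riemann sum, applied here to the Uniform[0,1] density:

```python
# Numerical check of Var[X] = integral of (x - mu)^2 f(x) dx, via a
# midpoint Riemann sum on a bounded support [a, b].

def variance_from_density(f, a, b, n=100_000):
    """Approximate the mean and variance of a density f supported on [a, b]."""
    dx = (b - a) / n
    xs = [a + (i + 0.5) * dx for i in range(n)]   # midpoints of each cell
    mu = sum(x * f(x) for x in xs) * dx           # E(X)
    var = sum((x - mu) ** 2 * f(x) for x in xs) * dx
    return mu, var

uniform_density = lambda x: 1.0                   # Uniform[0,1] density
mu, var = variance_from_density(uniform_density, 0.0, 1.0)
print(mu, var)   # approximately 0.5 and 1/12
```

The same helper works for any density with bounded support; densities on the whole line would need truncation or a change of variables first.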

Proposition 1:

\begin{displaymath}Var(X)=E(X^2)-\mu^2\end{displaymath}

Proposition 2:

\begin{displaymath}Var(aX)=a^2\, Var(X)\end{displaymath}
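A quick empirical check of this scaling property (my own sketch; the choice $a=3$, the uniform draws, and the seed are illustrative assumptions):

```python
import random

# Empirical check that Var(aX) = a^2 Var(X), using sample variances.
random.seed(0)

def sample_var(xs):
    """Population-style sample variance (divide by n)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

a = 3.0
x = [random.random() for _ in range(200_000)]     # Uniform[0,1] draws
ratio = sample_var([a * xi for xi in x]) / sample_var(x)
print(ratio)   # approximately a^2 = 9
```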

Proposition 3:
For independent random variables we have

\begin{displaymath}If\;\; X\; and \; Y\; independent,\quad
Var(X+Y)=Var(X) + Var(Y)\end{displaymath}
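This additivity can be checked by simulation (an illustrative sketch; the uniform draws and the seed are my assumptions, not from the lecture):

```python
import random

# Monte Carlo check that Var(X+Y) = Var(X) + Var(Y) for independent X, Y.
# X and Y are independent Uniform[0,1] draws, each with variance 1/12.
random.seed(1)

def sample_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

n = 200_000
x = [random.random() for _ in range(n)]
y = [random.random() for _ in range(n)]
s = [xi + yi for xi, yi in zip(x, y)]
print(sample_var(s), sample_var(x) + sample_var(y))  # both near 2/12
```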


For the exponential random variable with parameter $\lambda$, I showed: $Var(X)=\frac{1}{\lambda^2}$.
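A Monte Carlo sketch of this exponential variance (which equals $1/\lambda^2$); the choice $\lambda=2$ and the seed are mine, for illustration:

```python
import random

# Monte Carlo check that Exponential(lambda) has variance 1/lambda^2.
# Here lambda = 2, so the target value is 1/4 = 0.25.
random.seed(2)
lam = 2.0
n = 500_000
draws = [random.expovariate(lam) for _ in range(n)]
m = sum(draws) / n
var = sum((d - m) ** 2 for d in draws) / n
print(var)   # approximately 0.25
```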


For $U$ a uniform random variable on [0,1], I showed: $Var(U)=\frac{1}{12}$.


For $Z \sim Normal(0,1)$:

\begin{displaymath}Var(Z)=1\end{displaymath}

For the Normal $(\mu,\sigma^2)$ random variable X:

\begin{displaymath}Var(X)=\sigma^2\end{displaymath}
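A simulation sketch of the Normal variance $\sigma^2$ (the values $\mu=1$, $\sigma=3$ and the seed are illustrative choices), using the standard-library `random.gauss`:

```python
import random

# Monte Carlo check that Normal(mu, sigma^2) has variance sigma^2.
random.seed(3)
mu, sigma = 1.0, 3.0
n = 500_000
draws = [random.gauss(mu, sigma) for _ in range(n)]
m = sum(draws) / n
var = sum((d - m) ** 2 for d in draws) / n
print(var)   # approximately sigma^2 = 9
```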


This is the meaning of the second parameter in the definition of the density: its square root, called the standard deviation, measures the width of the curve.

I explained in class what standardizing a variable meant in general, not just for Normals.
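To make standardization concrete (a sketch; the exponential sample is an arbitrary example, not from the lecture): replacing each value $x$ by $(x-\mu)/\sigma$ produces data with mean 0 and variance 1, whatever the original distribution:

```python
import random

# Standardizing: Z = (X - mean) / sd has mean 0 and variance 1,
# regardless of the distribution of X. Example with Exponential(1) data.
random.seed(4)
n = 300_000
x = [random.expovariate(1.0) for _ in range(n)]
m = sum(x) / n
sd = (sum((xi - m) ** 2 for xi in x) / n) ** 0.5
z = [(xi - m) / sd for xi in x]
mz = sum(z) / n
vz = sum((zi - mz) ** 2 for zi in z) / n
print(mz, vz)   # approximately 0 and 1
```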

Susan Holmes