
Properties of Expectation 11/6

In general, for any function g of a discrete random variable X we have the formula:

\begin{displaymath}E[g(X)]=\sum_{i}g(x_i)\, m(x_i).\end{displaymath}
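As a quick illustration (an example added here, not one from the original notes), suppose X is uniform on the three values 1, 2, 3, so that m(x_i)=1/3 for each value, and take g(x)=x^2; the formula gives

\begin{displaymath}E[X^2]=1^2\cdot\tfrac{1}{3}+2^2\cdot\tfrac{1}{3}+3^2\cdot\tfrac{1}{3}=\tfrac{14}{3}.\end{displaymath}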


In particular, expectation is linear:

\begin{displaymath}E[aX+b]=aE[X]+b, \qquad \mbox{where $a$ and $b$ are real-valued constants.}\end{displaymath}
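As a quick sanity check of this linearity rule (again an added illustration, not from the notes), take X equal to 0 or 1, each with probability 1/2, so that E[X]=1/2, and take a=2, b=3:

\begin{displaymath}E[2X+3]=(2\cdot 0+3)\,\tfrac{1}{2}+(2\cdot 1+3)\,\tfrac{1}{2}=4=2\,E[X]+3.\end{displaymath}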

Beware:
If we take a function of a random variable, we cannot in general say that the expectation of the function equals the function of the expectation. Look for instance at the function g(X)=X^2, applied to the random variable X which takes the value -1 with probability 0.2, the value +1 with probability 0.3, and the value 0 with probability 0.5.

For this random variable E[X]=0.1 and E[X^2]=0.5, which is NOT the same as (E[X])^2=0.01.
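Spelling out the two sums behind these numbers:

\begin{displaymath}E[X]=(-1)(0.2)+(1)(0.3)+(0)(0.5)=0.1,\qquad
E[X^2]=(-1)^2(0.2)+(1)^2(0.3)+(0)^2(0.5)=0.5.\end{displaymath}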

  We define the kth moment of X as

\begin{displaymath}E[X^k]=\sum_{i} x_i^k\, m(x_i).\end{displaymath}
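For example (an added illustration, not one from the notes), if X is a Bernoulli variable taking the value 1 with probability p and the value 0 with probability 1-p, then every moment is the same:

\begin{displaymath}E[X^k]=0^k\,(1-p)+1^k\,p=p \qquad \mbox{for every } k\ge 1.\end{displaymath}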

For any two random variables X and Y, independent or not, we have:

\begin{displaymath}E[X+Y]=E[X]+E[Y].\end{displaymath}
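No independence is needed for this. As an added illustration (not from the notes), take Y=X for the random variable of the example above, so that X and Y are as dependent as possible; additivity together with linearity still gives

\begin{displaymath}E[X+Y]=E[2X]=2E[X]=0.2=E[X]+E[Y].\end{displaymath}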

For independent random variables we also have

\begin{displaymath}E[XY]=E[X]\,E[Y] \qquad \mbox{when $X$ and $Y$ are independent.}\end{displaymath}
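Independence does matter here: with the same dependent choice Y=X used in the added illustration above, the product rule fails, since

\begin{displaymath}E[XY]=E[X^2]=0.5 \neq 0.01 = E[X]\,E[Y].\end{displaymath}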



Susan Holmes
1998-12-07