Definition 1.9.1: Mean
Definition
Let
- $X$: Random variable
- Expectation of $X$ exists
If
$$\mu = E(X)$$
Then we say $\mu$ is the mean value of $X$
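As a quick illustration (not from this section), take $X$ to be the outcome of a fair six-sided die, so $p(x) = \tfrac{1}{6}$ for $x = 1, \dots, 6$. Then
$$\mu = E(X) = \sum_{x=1}^{6} x \cdot \frac{1}{6} = \frac{21}{6} = 3.5$$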
Definition 1.9.2: Variance
Definition
Let
- $X$: Random variable, with
- Finite mean $\mu$
- Finite $E[(X - \mu)^2]$
If
$$\sigma^2 = E[(X - \mu)^2]$$
Then we say $\sigma^2$ is the variance of $X$
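Continuing the fair-die illustration above, and using the standard identity $E[(X - \mu)^2] = E(X^2) - \mu^2$ (assumed here, not stated in this section), we have $E(X^2) = \sum_{x=1}^{6} x^2 \cdot \tfrac{1}{6} = \tfrac{91}{6}$, so
$$\sigma^2 = E(X^2) - \mu^2 = \frac{91}{6} - (3.5)^2 = \frac{35}{12} \approx 2.92$$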
Definition: Moments
Definition
In general, if $m$ is a positive integer, and if $M^{(m)}(t)$ means the $m$-th derivative of $M(t)$, we have, by repeated differentiation with respect to $t$,
$$M^{(m)}(0) = E(X^m)$$
Now
$$E(X^m) = \int_{-\infty}^\infty x^m f(x)\,dx \quad \text{or} \quad \sum_{x} x^m p(x),$$
and in mechanics, integrals (or sums) of this sort are called moments.
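As a worked example (an illustration, not from the source), suppose $M(t) = (1 - t)^{-1}$ for $t < 1$, the mgf of an exponential distribution with mean $1$. Repeated differentiation gives
$$M'(t) = (1 - t)^{-2}, \qquad M''(t) = 2(1 - t)^{-3},$$
so $E(X) = M'(0) = 1$ and $E(X^2) = M''(0) = 2$, which in turn gives $\sigma^2 = E(X^2) - \mu^2 = 1$.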
Theorem 1.9.1: Constant multiplication and addition with variance
Let
- $X$: Random variable, with
- Finite mean $\mu$
- Finite variance $\sigma^2$
- $a, b$: Constants
Then
$$\operatorname{Var}(aX + b) = a^2 \operatorname{Var}(X) = a^2 \sigma^2$$
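A short check of this theorem, using only the definition of variance and linearity of expectation: since $E(aX + b) = a\mu + b$,
$$\operatorname{Var}(aX + b) = E\big[(aX + b - a\mu - b)^2\big] = E\big[a^2 (X - \mu)^2\big] = a^2 \sigma^2$$
In particular, adding the constant $b$ does not change the variance.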
Definition 1.9.3: Moment generating function (mgf)
Definition
Let $X$: Random variable
If
$$M(t) = E(e^{tX}) \text{ exists for } -h < t < h, \text{ for some } h > 0$$
Then we say $M(t)$ is the moment generating function (mgf) of $X$
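For example (a standard case, not taken from this section), if $X$ is Bernoulli with $P(X = 1) = p$ and $P(X = 0) = 1 - p$, then
$$M(t) = E(e^{tX}) = (1 - p) + p e^t,$$
which exists for every real $t$.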
Theorem 1.9.2: Uniqueness of mgf
Let
- $X, Y$: Random variables
- $M_X, M_Y$: mgf of $X, Y$ respectively, existing in some neighborhood of $0$
Then
$$F_X(z) = F_Y(z) \ \text{for all } z \quad \iff \quad M_X(t) = M_Y(t) \ \text{for all } t \text{ in some neighborhood of } 0$$
This theorem states that two random variables have the same distribution if and only if they have the same mgf in some neighborhood of zero.
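As a standard illustration of how the theorem is used (not from the source): $e^{t^2/2}$ is the mgf of the standard normal distribution, so any random variable whose mgf equals $e^{t^2/2}$ in a neighborhood of $0$ must itself be standard normal:
$$M_X(t) = e^{t^2/2} \text{ near } 0 \implies X \sim N(0, 1)$$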
Remark 1.9.1: Characteristic function
Let
- $X$: Random variable
- $i$: Imaginary unit
If
$$\varphi(t) = E(e^{itX})$$
Then we say $\varphi(t)$ is the characteristic function of $X$
An important property of this expectation is that, while a distribution may not have an mgf, every distribution has a unique characteristic function.
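A standard example (not from this section): the Cauchy distribution with density $f(x) = \frac{1}{\pi(1 + x^2)}$ has no mgf, since $E(e^{tX})$ diverges for every $t \neq 0$, yet its characteristic function exists and equals
$$\varphi(t) = e^{-|t|}$$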