This week we will be talking about moments and moment generating functions. This material is also covered in Chapter 3, Section 4 of Evans and Rosenthal.

Monday, January 13

Today we introduced moments.

Definition (Moments)

Let \(X\) be a random variable with mean \(\mu\) and variance \(\sigma^2\). The k-th moment of \(X\) is \(E(X^k)\), the k-th central moment of \(X\) is \(E\left((X-\mu)^k\right)\), and the k-th standard moment of \(X\) is \(\displaystyle E\left(\frac{(X-\mu)^k}{\sigma^k}\right)\).

Moments aren't always defined. For example, in the game from the St. Petersburg paradox the expected value (which is the 1st moment) is infinite.
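
When the moments do exist, they are just integrals (or sums), so a computer algebra system can handle them. Here is a short SymPy sketch (not something we did in class) that computes a few raw and central moments of an \(\text{Exp}(1)\) random variable from its density \(e^{-x}\) on \([0,\infty)\).

    import sympy as sp

    x = sp.symbols('x', positive=True)
    f = sp.exp(-x)                                # density of Exp(1) on [0, oo)

    # raw moments E(X^k) for k = 1, 2, 3
    raw = [sp.integrate(x**k * f, (x, 0, sp.oo)) for k in (1, 2, 3)]   # [1, 2, 6]

    mu = raw[0]                                   # mean = 1
    var = raw[1] - mu**2                          # variance = 1

    # 2nd and 3rd central moments E((X - mu)^k)
    central = [sp.integrate((x - mu)**k * f, (x, 0, sp.oo)) for k in (2, 3)]   # [1, 2]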

We also introduced the concept of symmetry for probability distributions.

Definition (Symmetry)

A random variable \(X\) is symmetric about its mean if \(X-\mu\) has the same probability distribution as \(\mu-X\).

Symmetric variables have the following nice property:

Theorem (Symmetry & Moments)

If \(X\) is a random variable that is symmetric about \(\mu\), then all odd central moments of \(X\) are equal to zero.
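
Here is a quick SymPy check of this theorem for one symmetric example, the triangular density \(f(x) = 1 - |x|\) on \([-1,1]\) (not a distribution from class, just a convenient symmetric example whose mean is 0).

    import sympy as sp

    x = sp.symbols('x', real=True)

    # triangular density f(x) = 1 - |x| on [-1, 1]; it is symmetric about its mean 0
    f = sp.Piecewise((1 + x, x < 0), (1 - x, True))

    # the first few odd central moments all come out to 0
    odd_central = [sp.integrate(x**k * f, (x, -1, 1)) for k in (1, 3, 5)]   # [0, 0, 0]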

This suggests that you can tell how skewed a random variable is by looking at its odd central moments. This leads to the following definition.

Definition (Skewness)

The skewness of a random variable \(X\) is its 3rd standard moment \(\displaystyle E \left(\frac{(X-\mu)^3}{\sigma^3} \right)\).
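
For a concrete example that we didn't do in class, here is a SymPy sketch that computes the skewness of \(X \sim \text{Exp}(1)\), using the same density \(e^{-x}\) as in the sketch above.

    import sympy as sp

    x = sp.symbols('x', positive=True)
    f = sp.exp(-x)                                         # density of Exp(1) on [0, oo)

    mu = sp.integrate(x * f, (x, 0, sp.oo))                # mean = 1
    var = sp.integrate((x - mu)**2 * f, (x, 0, sp.oo))     # variance = 1

    # skewness = 3rd standard moment
    skewness = sp.integrate((x - mu)**3 * f, (x, 0, sp.oo)) / var**sp.Rational(3, 2)   # 2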

In-class Problems

We did the following problems in class today.

  1. Find the expected value of the game from the St. Petersburg Paradox.

  2. Calculate the 3rd moment of \(X \sim \text{Norm}(0,1)\). We used WolframAlpha to calculate the integral.

  3. Calculate the k-th moment of \(X \sim \text{Unif}(-1,1)\).

Study Questions

  1. Why are odd central moments always zero when a random variable is symmetric?

  2. What can you say about the skewness of a random variable that is skewed left? What about one that is skewed right?

  3. Try calculating some moments of a few probability distributions. If you get a complicated integral, use WolframAlpha.


Wednesday, January 15

Today we talked about kurtosis and introduced moment generating functions.

Definition (Kurtosis)

The kurtosis of a random variable \(X\) is the 4th standard moment \(\displaystyle E \left( \frac{(X-\mu)^4}{\sigma^4} \right)\).

Kurtosis measures how fat the tails of a distribution are. The kurtosis of a normal random variable is always 3. Sometimes people talk about the excess kurtosis, which is the kurtosis minus 3.
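
As an example we didn't do in class, here is a SymPy sketch that computes the kurtosis of a Laplace(0,1) random variable, whose density \(\frac{1}{2}e^{-|x|}\) has fatter tails than the normal density.

    import sympy as sp

    x = sp.symbols('x', positive=True)

    # Laplace(0,1) density is exp(-|x|)/2; it is symmetric about its mean 0, so we can
    # get even central moments by integrating over [0, oo) and doubling.
    m2 = 2 * sp.integrate(x**2 * sp.exp(-x) / 2, (x, 0, sp.oo))   # variance = 2
    m4 = 2 * sp.integrate(x**4 * sp.exp(-x) / 2, (x, 0, sp.oo))   # 4th central moment = 24

    kurtosis = m4 / m2**2   # 6, so the excess kurtosis is 3 (fatter tails than a normal)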

Calculating moments gets tedious. Moment generating functions give an alternative approach. Moment generating functions are similar to the probability generating functions that we talked about last semester.

Definition (MGFs & PGFs)

Let \(X\) be a random variable and \(t\) be a dummy variable. The moment generating function (MGF) of \(X\) is \[m_X(t) = E\left(e^{tX}\right),\] which is defined for any \(t\) where this expected value is finite. The probability generating function (PGF) of \(X\) is \[G_X(t) = E\left(t^X\right).\]
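
For a small example (not one from class), suppose \(X\) is a made-up discrete random variable that takes the values 1, 2, 3 with probabilities \(\frac{1}{2}, \frac{1}{3}, \frac{1}{6}\). Then both generating functions are just finite sums, which SymPy can write down directly.

    import sympy as sp

    t = sp.symbols('t')

    values = [1, 2, 3]
    probs = [sp.Rational(1, 2), sp.Rational(1, 3), sp.Rational(1, 6)]

    mgf = sum(p * sp.exp(t * v) for v, p in zip(values, probs))   # E(e^{tX}) = e^t/2 + e^(2t)/3 + e^(3t)/6
    pgf = sum(p * t**v for v, p in zip(values, probs))            # E(t^X)   = t/2 + t^2/3 + t^3/6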


Theorem (Moments & MGFs)

If the moment generating function \(m_X(t)\) is defined for all \(t\) close to zero, then the k-th moment of \(X\) is the k-th derivative of \(m_X(t)\) evaluated at \(t=0\). That is: \[E(X^k) = m_X^{(k)}(0).\]
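
Here is a SymPy sketch of this theorem in action, using the MGF of \(U \sim \text{Unif}(0,1)\). We didn't compute this MGF in class, but you can check from the definition that it is \(\frac{e^t - 1}{t}\); the sketch takes that as given and reads off the first few moments by differentiating at \(t = 0\).

    import sympy as sp

    t = sp.symbols('t')

    # take as given that the MGF of U ~ Unif(0,1) is (e^t - 1)/t
    m = (sp.exp(t) - 1) / t

    # k-th moment = k-th derivative at t = 0; the formula has a removable
    # singularity at t = 0, so take a limit instead of plugging in directly
    moments = [sp.limit(sp.diff(m, t, k), t, 0) for k in range(1, 5)]   # [1/2, 1/3, 1/4, 1/5]

So \(E(U^k) = \frac{1}{k+1}\), which matches what you get by integrating \(u^k\) over \([0,1]\).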

In-class problems

  1. Find the kurtosis of \(X \sim \text{Norm}(0,1)\). It's okay to use WolframAlpha to calculate the integral.

  2. Find the kurtosis of \(U \sim \text{Unif}(0,1)\). Notice that the kurtosis of a uniform random variable is less than the kurtosis of a normal random variable. Why does that make sense?

  3. Find the moment generating function for the random variable \(X\) which is 0 if a fair coin lands on tails and 1 if it lands on heads.

  4. What is the moment generating function for the number rolled on a fair six-sided die?

  5. What is the moment generating function for \(Y \sim \text{Exp}(1)\)? Use the moment generating function to find the expected value of \(Y\). When we did this one in class, we got the right moment generating function, but when we took the derivative, we got the wrong answer. The derivative at zero should be positive 1, not negative 1 (see the quick SymPy check after this list).
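
For problem 5, here is a short SymPy check (not done in class) that the derivative of the MGF at zero really is \(+1\).

    import sympy as sp

    t = sp.symbols('t')
    m = 1 / (1 - t)                 # the MGF we found for Y ~ Exp(1)
    sp.diff(m, t).subs(t, 0)        # 1, so E(Y) = 1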

Study Questions

  1. Why do all normal random variables have the same kurtosis? Why doesn't the kurtosis depend on the mean and the variance?

  2. Why is kurtosis always positive? Is the same thing true for skewness? Why not?


Friday, January 17

We started class by proving the theorem that says that the k-th moment of a random variable \(X\) is the k-th derivative of the moment generating function (MGF) at \(t=0\). You prove this using Taylor series by writing down the Taylor series for an MGF two different ways: \[m_X(t) = E\left(e^{tX}\right) = E\left(1 + tX + \frac{(tX)^2}{2!} + \frac{(tX)^3}{3!} + \ldots\right) = 1 + E(X)\,t + \frac{E(X^2)}{2!}\,t^2 + \frac{E(X^3)}{3!}\,t^3 + \ldots\] and \[m_X(t) = m_X(0) + m_X'(0)\,t + \frac{m_X''(0)}{2!}\,t^2 + \frac{m_X'''(0)}{3!}\,t^3 + \ldots\]

By matching the corresponding coefficients of the powers of \(t\), we saw that \(E(X^k)\) must be the same as \(m_X^{(k)}(0)\).

The idea of this proof leads to the following alternative version of this theorem:

Theorem (Taylor Coefficients and Moments)

If an MGF \(m_X(t)\) has MacLaurin series \[a_0 + a_1 t + a_2 t^2 + a_3 t^3 + \ldots,\] then the moments of \(X\) are \(E(X^k) = k! a_k\).
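
Here is a SymPy sketch of this version of the theorem, again taking as given that the MGF of \(U \sim \text{Unif}(0,1)\) is \(\frac{e^t - 1}{t}\).

    import sympy as sp

    t = sp.symbols('t')

    m = (sp.exp(t) - 1) / t                 # MGF of U ~ Unif(0,1), taken as given
    poly = sp.series(m, t, 0, 6).removeO()  # 1 + t/2 + t**2/6 + t**3/24 + t**4/120 + ...

    # k-th moment = k! times the coefficient of t^k
    moments = [sp.factorial(k) * poly.coeff(t, k) for k in range(1, 5)]   # [1/2, 1/3, 1/4, 1/5]

These are the same answers as differentiating the MGF directly, as in the sketch from Wednesday.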

After that, we used the definition to find the MGF of a Norm(0,1) random variable and converted that to a power series. From the power series we could use this theorem to figure out all of the moments. See In-Class Exercises 1-3 below and see if you remember how we did this.

Moment generating functions for sums of independent random variables work the same way as probability generating functions.

Theorem (Sums of Independent RVs)

If \(X\) and \(Y\) are independent random variables with MGFs \(m_X\) and \(m_Y\) respectively, then the MGF for \(X+Y\) is \[m_{X+Y}(t) = m_X(t) m_Y(t).\]
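
As a small example (not one from class), here is a SymPy sketch that combines this theorem with the MGF \(\frac{1}{1-t}\) of an \(\text{Exp}(1)\) random variable: the MGF of the sum of two independent \(\text{Exp}(1)\) variables is the product \(\frac{1}{(1-t)^2}\), and differentiating it gives the moments of the sum.

    import sympy as sp

    t = sp.symbols('t')

    m_Y = 1 / (1 - t)                       # MGF of Y ~ Exp(1)
    m_sum = sp.simplify(m_Y * m_Y)          # MGF of Y1 + Y2 (independent): 1/(1 - t)**2

    E1 = sp.diff(m_sum, t, 1).subs(t, 0)    # 2, so E(Y1 + Y2) = 2
    E2 = sp.diff(m_sum, t, 2).subs(t, 0)    # 6, so Var(Y1 + Y2) = 6 - 2**2 = 2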

The final theorem this week is the following:

Theorem (MGFs Determine Distributions)

If two random variables have the same MGF, defined for all \(t\) close to zero, then they must have the same probability distribution. This theorem takes some pretty advanced analysis to prove, so we'll just accept it for now.

In-Class Exercises

  1. Find the MGF of \(Z \sim \text{Norm}(0,1)\) (it's okay to use WolframAlpha to do the integral).

  2. Use the MacLaurin series for \(e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \ldots\) to convert the MGF for \(Z\) into a power series.

  3. Find the first four moments of \(Z\).

  4. If \(Y \sim \text{Exp}(1)\), then we saw that the MGF for \(Y\) is \(\frac{1}{1-t}\). The MacLaurin series for this function is \(1 + t + t^2 + t^3 + \ldots\). What are the moments of \(Y\)?

  5. Find the MGF of a \(\text{Pois}(\lambda)\) random variable.

  6. Show that the sum of two independent Poisson random variables is Poisson.

Study Questions

  1. A sum of independent Poisson random variables is always Poisson. What other probability distributions have this property? Look at the list of MGFs for the common probability distributions and see if you can find any other distributions with this property.

Click here to go to the notes from Week 2