This week we will focus on transformations of random variables.
Wednesday, January 29
We started today with the following warm-up problem:
Most programming languages have a built-in random number generator
function that will generate (pseudo)random numbers between 0 and 1. How
could you use a random number generator like that to generate a number
between 50 and 100?
If $U$ is a $\text{Uniform}(0,1)$ random variable, then $50 + 50U$ will be a
$\text{Uniform}(50,100)$ random variable. What about other transformations of random variables?
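The warm-up answer (scale a Uniform(0,1) draw by 50, then shift by 50) can be checked with a quick simulation; this is a sketch in Python, and the function name is ours:

```python
import random

def uniform_50_100():
    """Scale and shift: if U ~ Uniform(0,1), then 50 + 50*U ~ Uniform(50,100)."""
    u = random.random()   # U ~ Uniform(0, 1)
    return 50 + 50 * u

random.seed(0)
samples = [uniform_50_100() for _ in range(100_000)]
# Every sample lands in [50, 100]; the sample mean should be near 75.
print(min(samples), max(samples), sum(samples) / len(samples))
```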
For situations like that, we have the following change of variables
theorem.
Theorem (Change of Variables)
If $X$ is a continuous random variable with density $f_X$ and $Y = g(X)$
where $g$ is a strictly increasing (or decreasing) differentiable function, then
$$f_Y(y) = f_X(g^{-1}(y)) \left| \frac{d}{dy}\, g^{-1}(y) \right|.$$
The support of $Y$ is the interval from $g(a)$ to $g(b)$ where $(a, b)$ is the support of $X$.
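As a sanity check of the theorem (our own example, not one from class), take $Y = \sqrt{U}$ with $U \sim \text{Uniform}(0,1)$ and compare the predicted probability with a simulation:

```python
import random

# Sanity check of the theorem for g(u) = sqrt(u) with U ~ Uniform(0,1):
# g^{-1}(y) = y^2, so f_Y(y) = f_U(y^2) * (d/dy) y^2 = 2y on (0, 1),
# and hence P(Y <= 0.5) = integral of 2y dy from 0 to 0.5 = 0.25.
random.seed(1)
n = 200_000
p_hat = sum(random.random() ** 0.5 <= 0.5 for _ in range(n)) / n
print(p_hat)  # Monte Carlo estimate of P(Y <= 0.5), close to 0.25
```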
According to the Change of Variables Theorem, what is the density
function for the warm-up transformation $Y = 50 + 50U$, where $U$ is
$\text{Uniform}(0,1)$?
Suppose $Y = -\ln(X)$ where $X \sim \text{Uniform}(0,1)$.
What is the probability distribution of $Y$?
What is the density function for a linear transformation $Y = aX + b$
of an r.v. $X$?
What is the support of the density function?
So why does this work?
Proof of Change of Variables Theorem
Let’s assume $g$ is a strictly increasing function. (Showing that the theorem is true
when $g$ is strictly decreasing is Homework 2, Problem 4).
Strictly increasing means that $u \le v$ iff $g(u) \le g(v)$.
To find the density function for $Y$,
first we’ll write down the CDF for $Y$:
$$\begin{align*}F_Y(y) &= P(Y \le y) && \text{(definition of CDF)} \\
&= P(g(X) \le y) && \text{(substitution)} \\
&= P(X \le g^{-1}(y)) && \text{(since } g \text{ is strictly increasing)} \\
&= F_X(g^{-1}(y)). \end{align*}$$
To get the PDF from the CDF, take the derivative of both sides with respect to $y$.
Then by the chain rule, this is:
$$f_Y(y) = f_X(g^{-1}(y)) \, \frac{d}{dy}\, g^{-1}(y).$$
Since $g$ is increasing, so is $g^{-1}$, and therefore
$\frac{d}{dy}\, g^{-1}(y)$ is nonnegative, so writing it with the absolute
value, as in the statement of the theorem, changes nothing in this case.
Study Question
We saw in class today that if you randomly generate a number $U$
uniformly between 0 and 1, then $-\ln(U)$ will have an
$\text{Exp}(1)$ distribution. What if you wanted a random number with an
$\text{Exp}(\lambda)$ distribution where $\lambda \ne 1$?
How could you generate a random number with that distribution?
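One standard approach to questions like this is inverse transform sampling. A minimal Python sketch (the helper name is ours): since $-\ln(U)$ is $\text{Exp}(1)$, scaling by $1/\lambda$ gives an $\text{Exp}(\lambda)$ variable, whose mean is $1/\lambda$.

```python
import math
import random

def exp_sample(lam):
    """Sample from Exp(lam) by transforming a uniform: -ln(U)/lam.
    (Using 1 - random.random() keeps the argument of log strictly positive.)"""
    return -math.log(1.0 - random.random()) / lam

random.seed(2)
lam = 3.0
samples = [exp_sample(lam) for _ in range(200_000)]
print(sum(samples) / len(samples))  # sample mean, close to 1/lam = 1/3
```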
Friday, January 31
Today we covered multivariate transformations of random variables.
But first, we looked at an example where the change of variables theorem
didn’t apply directly, but we were able to use the idea of the proof to
make things work.
Let $Y = X^2$ where $X \sim \text{Normal}(0,1)$.

Why doesn’t the change of variables theorem apply to $g(x) = x^2$ and $X$?

Write the CDF for $Y$ ($F_Y(y) = P(Y \le y)$) in terms of the CDF for $X$.

Differentiate both sides with respect to $y$ to find the density function $f_Y(y)$.
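The CDF approach can be spot-checked by simulation; here is a sketch using the non-monotone transformation $Y = X^2$ for $X$ standard normal (our choice of example):

```python
import random

# The CDF trick for a non-monotone transformation, e.g. Y = X^2 with
# X ~ Normal(0,1): F_Y(y) = P(-sqrt(y) <= X <= sqrt(y)).
# Spot check at y = 1: P(Y <= 1) = P(-1 <= X <= 1), which is about 0.6827.
random.seed(3)
n = 200_000
p_hat = sum(random.gauss(0, 1) ** 2 <= 1 for _ in range(n)) / n
print(p_hat)  # close to 0.6827
```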
Once we worked out that example, we moved on to the main result of
today:
Theorem (Multivariate Change of Variables)
Suppose $X = (X_1, \ldots, X_n)$ is a continuous random vector with joint density
$f_X$ and $Y = g(X)$ where $g : \mathbb{R}^n \to \mathbb{R}^n$ is a function that
satisfies the following assumptions:

1. $g$ is invertible,
2. all partial derivatives $\frac{\partial g_i}{\partial x_j}$ of $g$ exist and are continuous, and
3. the Jacobian matrix $Dg$ has nonzero determinant.

Then
$$f_Y(y) = f_X(g^{-1}(y)) \left| \det Dg^{-1}(y) \right|$$
where $\left| \det Dg^{-1}(y) \right|$ is the absolute value of the determinant of the
Jacobian matrix of $g^{-1}$.
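The theorem can be checked by simulation on a simple linear map (our own example, not one from class):

```python
import random

# Check the multivariate formula on a linear map:
# (Y1, Y2) = g(X1, X2) = (X1 + X2, X1 - X2) with X1, X2 iid Uniform(0,1).
# g^{-1}(y1, y2) = ((y1 + y2)/2, (y1 - y2)/2) has Jacobian determinant -1/2,
# so f_Y = 1 * |-1/2| = 1/2 on the image of the unit square.
# Integrating 1/2 over the part of the image with y1 <= 1 (area 1)
# predicts P(Y1 <= 1) = 1/2.
random.seed(4)
n = 200_000
p_hat = sum(random.random() + random.random() <= 1 for _ in range(n)) / n
print(p_hat)  # close to 0.5
```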
Example: Box-Muller Transformation
Suppose that $S \sim \text{Exp}(1)$ and $\Theta \sim \text{Uniform}(0, 2\pi)$
are independent random variables. Let
$$Z_1 = \sqrt{2S} \cos \Theta ~~\text{ and }~~ Z_2 = \sqrt{2S} \sin \Theta.$$

Show that the transformation $g(s, \theta) = (\sqrt{2s}\cos\theta, \sqrt{2s}\sin\theta)$ is one-to-one.

Find a formula for $g^{-1}$ (Hint: what is $Z_1^2 + Z_2^2$?).

Find the Jacobian matrix $Dg^{-1}$ and show that the determinant is always 1.

Use the Multivariate Change of Variables Theorem to show that the
joint density for $(Z_1, Z_2)$ is
$$f(z_1, z_2) = \frac{1}{2\pi} e^{-(z_1^2 + z_2^2)/2}.$$

Observe that $Z_1$ and $Z_2$ are independent $\text{Normal}(0,1)$ random
variables since $f(z_1, z_2)$ factors into
$\left(\frac{1}{\sqrt{2\pi}} e^{-z_1^2/2}\right)\left(\frac{1}{\sqrt{2\pi}} e^{-z_2^2/2}\right)$.
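In practice, Box-Muller is run from two uniform samples, using the facts from Wednesday that $-\ln(U_1) \sim \text{Exp}(1)$ and $2\pi U_2 \sim \text{Uniform}(0, 2\pi)$. A minimal Python sketch (the function name is ours):

```python
import math
import random

def box_muller():
    """One pair of independent Normal(0,1) samples from two Uniform(0,1)
    samples: S = -ln(U1) ~ Exp(1) and Theta = 2*pi*U2 ~ Uniform(0, 2*pi)."""
    s = -math.log(1.0 - random.random())   # S ~ Exp(1); avoids log(0)
    theta = 2 * math.pi * random.random()  # Theta ~ Uniform(0, 2*pi)
    r = math.sqrt(2 * s)
    return r * math.cos(theta), r * math.sin(theta)

random.seed(5)
z1 = [box_muller()[0] for _ in range(100_000)]
mean = sum(z1) / len(z1)
var = sum((z - mean) ** 2 for z in z1) / len(z1)
print(mean, var)  # mean near 0, variance near 1
```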