The covariance matrix of a random vector \(X = (X_1, X_2, \ldots, X_n)^T\) is the \(n\)-by-\(n\) matrix with the entry in row \(i\) and column \(j\) given by \(\operatorname{Cov}(X_i,X_j)\). Show that you get the same matrix if you calculate \(E((X-\mu_X)(X-\mu_X)^T)\) where \(\mu_X\) is the vector \(E(X)\). Hint: It is enough to explain why the entry in the \(i\)-th row and \(j\)-th column of the two matrices is the same, for any \(i\) and \(j\).
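If you want to sanity-check the identity numerically before proving it, here is a minimal Python sketch; the distribution, mean vector, and covariance used below are arbitrary illustrative choices, not part of the problem. It estimates each \(\operatorname{Cov}(X_i,X_j)\) entry by entry and also averages the outer products \((x-\mu_X)(x-\mu_X)^T\), and the two matrices agree.

```python
import numpy as np

# Illustrative check: sample a random vector X and compare the entry-wise
# covariances Cov(X_i, X_j) with the matrix E[(X - mu)(X - mu)^T].
rng = np.random.default_rng(0)
n, samples = 3, 200_000
X = rng.multivariate_normal(mean=[1.0, 2.0, 3.0],
                            cov=[[2.0, 0.5, 0.0],
                                 [0.5, 1.0, 0.3],
                                 [0.0, 0.3, 1.5]],
                            size=samples)          # each row is one draw of X^T

mu = X.mean(axis=0)                                # estimate of mu_X
centered = X - mu

# Entry-wise: estimate Cov(X_i, X_j) one pair (i, j) at a time.
entrywise = np.array([[np.mean(centered[:, i] * centered[:, j])
                       for j in range(n)] for i in range(n)])

# Matrix form: average of the outer products (x - mu)(x - mu)^T.
outer_form = centered.T @ centered / samples

print(np.allclose(entrywise, outer_form))          # True: same matrix either way
```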
If \(X\) and \(Y\) are independent random variables, then \(\operatorname{Var}(X+Y) = \operatorname{Var}(X)+\operatorname{Var}(Y)\). What if they aren't independent? Use the fact that \(\operatorname{Var}(X+Y) = \operatorname{Cov}(X+Y,X+Y)\) to find a formula for the variance of \(X+Y\). In particular, you should show that \[\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X,Y).\]
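A quick simulation can make the formula concrete before you derive it. The dependent pair below is an arbitrary example chosen for illustration; any dependent \(X\) and \(Y\) would do.

```python
import numpy as np

# Numerical check of Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y)
# for a deliberately dependent pair (X, Y).
rng = np.random.default_rng(1)
X = rng.normal(size=500_000)
Y = 0.6 * X + rng.normal(scale=0.8, size=500_000)   # Y depends on X

lhs = np.var(X + Y)
rhs = np.var(X) + np.var(Y) + 2 * np.cov(X, Y, bias=True)[0, 1]
print(lhs, rhs)   # the two values agree (up to floating-point rounding)
```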
If you randomly select a married couple, the heights of the couple form a random vector \(X\) with a multivariate normal distribution, where \(X_1\) is the height of the husband and \(X_2\) is the height of the wife. Suppose that \(\mu_X = \begin{bmatrix}70 \\ 65 \end{bmatrix}\) (measured in inches) and the covariance matrix is \[\Sigma_X = \begin{bmatrix} 10 & 5 \\ 5 & 5 \end{bmatrix}.\] Graph three different level sets for the joint density function: \[f_X(x_1, x_2) = \frac{1}{2\pi|\Sigma_X|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu_X)^T \Sigma_X^{-1} (x-\mu_X) \right).\] What shape is each level set?
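One way to produce the graph is to evaluate the density on a grid and let a contour plotter draw a few level sets. This sketch assumes NumPy and matplotlib are available; the grid ranges are arbitrary choices around \(\mu_X\).

```python
import numpy as np
import matplotlib.pyplot as plt

# Level sets (contours) of the given bivariate normal density.
mu = np.array([70.0, 65.0])
Sigma = np.array([[10.0, 5.0],
                  [5.0, 5.0]])
Sigma_inv = np.linalg.inv(Sigma)
det = np.linalg.det(Sigma)

x1, x2 = np.meshgrid(np.linspace(60, 80, 300), np.linspace(55, 75, 300))
d = np.stack([x1 - mu[0], x2 - mu[1]], axis=-1)            # (x - mu) at every grid point
quad = np.einsum('...i,ij,...j->...', d, Sigma_inv, d)     # (x - mu)^T Sigma^{-1} (x - mu)
f = np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(det))

plt.contour(x1, x2, f, levels=3)    # a few level sets; each is an ellipse
plt.xlabel('$x_1$ (husband height, in)')
plt.ylabel('$x_2$ (wife height, in)')
plt.gca().set_aspect('equal')
plt.show()
```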
Suppose that \(X\) is a random vector in \(\mathbb{R}^n\) with a multivariate normal (MVN) distribution with joint density function \[f_X(x) = \frac{1}{(2\pi)^{n/2}|\Sigma_X|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu_X)^T \Sigma_X^{-1} (x-\mu_X) \right).\] Use the change of variables formula \(f_Y(y) = f_X(x) \left| \frac{\partial X}{\partial Y} \right|\) to show that if \(A \in \mathbb{R}^{n \times n}\) is an invertible matrix, then \(Y = AX\) has an MVN distribution. Be sure to express the joint density function \(f_Y(y)\) as a function of \(y\) and verify that it is an MVN distribution. Recall that the Jacobian matrix \(\frac{\partial Y}{\partial X}\) for a linear transformation \(A\) is just \(A\), and that \(\det(A^{-1}) = \frac{1}{\det(A)}\).
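This exercise asks for an analytic derivation, but a simulation can corroborate what the derived density should say: \(Y\) has mean \(A\mu_X\) and covariance \(A\Sigma_X A^T\). The mean, covariance, and matrix \(A\) below are arbitrary illustrative choices.

```python
import numpy as np

# Sanity check (not a proof): if X ~ MVN(mu, Sigma) and Y = AX with A invertible,
# then Y should have mean A mu and covariance A Sigma A^T.
rng = np.random.default_rng(2)
mu = np.array([1.0, -1.0, 2.0])
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, 0.2],
                  [0.0, 0.2, 1.5]])
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])      # any invertible matrix will do

X = rng.multivariate_normal(mu, Sigma, size=300_000)
Y = X @ A.T                          # each row is y = A x

print(np.round(Y.mean(axis=0), 2), np.round(A @ mu, 2))    # compare with A mu
print(np.round(np.cov(Y.T), 2))                            # sample covariance of Y
print(np.round(A @ Sigma @ A.T, 2))                        # compare with A Sigma A^T
```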
Suppose that \(X_1, X_2,\) and \(X_3\) are i.i.d. normal random variables with mean \(\mu\) and variance \(\sigma^2\). Find the covariance matrix and joint density function for \(Y_1\), \(Y_2\), and \(Y_3\) given by: \[\begin{bmatrix} Y_1 \\ Y_2 \\ Y_3 \end{bmatrix} = \begin{bmatrix} 1 & -1 & 0 \\ 1 & 1 & -2 \\ 1 & 1 & 1 \end{bmatrix} \begin{bmatrix} X_1 \\ X_2 \\ X_3 \end{bmatrix}.\]
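To check a by-hand answer, you can use the fact from the previous exercise that \(\operatorname{Cov}(AX) = A\operatorname{Cov}(X)A^T\), which here equals \(\sigma^2 AA^T\) since \(\operatorname{Cov}(X) = \sigma^2 I\). This short sketch simply prints \(AA^T\) for the given matrix.

```python
import numpy as np

# Cov(Y) = A (sigma^2 I) A^T = sigma^2 * A A^T, so A A^T is the covariance
# matrix up to the factor sigma^2.
A = np.array([[1, -1,  0],
              [1,  1, -2],
              [1,  1,  1]])
print(A @ A.T)    # multiply by sigma^2 to get Cov(Y)
```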
An orthogonal matrix is an invertible matrix \(U \in \mathbb{R}^{n \times n}\) such that \(U^T = U^{-1}\). Suppose that \(X\) is a random vector in \(\mathbb{R}^{n}\) with the standard MVN distribution (i.e., the \(X_i\) are i.i.d. \(\operatorname{Norm}(0,1)\)). Show that the joint density function for \(Y = UX\) is \[f_Y(y) = \frac{1}{(2\pi)^{n/2}} \exp \left( - \frac{1}{2} \|y\|^2 \right).\]
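An empirical companion to the proof, assuming NumPy: build an orthogonal \(U\) from the QR factorization of a random matrix, transform standard normal draws, and confirm that the sample covariance of \(Y = UX\) is close to the identity, consistent with the claimed density. The dimension and seed are arbitrary.

```python
import numpy as np

# For an orthogonal U, Y = UX should again look like a standard MVN,
# so its sample covariance should be close to the identity matrix.
rng = np.random.default_rng(3)
n = 4
U, _ = np.linalg.qr(rng.normal(size=(n, n)))   # QR of a random matrix gives an orthogonal U
print(np.round(U.T @ U, 6))                    # identity, confirming U^T = U^{-1}

X = rng.normal(size=(200_000, n))              # rows are draws of the standard MVN X
Y = X @ U.T                                    # each row is y = U x
print(np.round(np.cov(Y.T), 2))                # approximately the identity matrix
```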