\[ \newcommand{\on}{\operatorname} \newcommand{\ds}{\displaystyle} \newcommand{\R}{\mathbb{R}} \]
Suppose that \(X_1 \sim \on{Norm}(0,2)\) and \(X_2 \sim \on{Norm}(0,3)\). If \(\on{Cov}(X_1,X_2) = -4\), then what is the covariance matrix for the random vector \(X = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}\)?
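A quick NumPy sketch of assembling and sanity-checking the covariance matrix. It assumes \(\on{Norm}(0,2)\) means mean \(0\) and standard deviation \(2\) (if the second parameter were the variance, \(|\on{Cov}(X_1,X_2)| \le \sqrt{2 \cdot 3} < 4\) would make \(-4\) impossible):

```python
import numpy as np

# Assumption: Norm(0, 2) means mean 0, standard deviation 2 (variance 4).
var1, var2 = 2.0**2, 3.0**2
cov12 = -4.0
Sigma = np.array([[var1, cov12],
                  [cov12, var2]])

# A valid covariance matrix is symmetric positive semidefinite:
assert np.all(np.linalg.eigvalsh(Sigma) >= 0)
print(Sigma)
```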
What is the joint density function for the random vector \(X\) above?
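Recall the standard form of the density: for a mean-zero bivariate normal vector with covariance matrix \(\Sigma\),
\[ f_X(x) = \frac{1}{2\pi\sqrt{\det \Sigma}} \exp\!\left(-\tfrac{1}{2}\, x^\top \Sigma^{-1} x\right), \qquad x \in \R^2. \]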
Which of the following possible values for the random vector \(X\) above is the more likely outcome: \(X = \begin{bmatrix} 1 \\ -1 \end{bmatrix}\) or \(X = \begin{bmatrix} 1 \\ 1 \end{bmatrix}\)?
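The two candidate outcomes can be compared numerically by evaluating the joint density at each point (again assuming variances \(4\) and \(9\) with covariance \(-4\)):

```python
import numpy as np

Sigma = np.array([[4.0, -4.0], [-4.0, 9.0]])  # assumed covariance matrix
Sinv = np.linalg.inv(Sigma)

def density(x):
    """Zero-mean bivariate normal density at x."""
    x = np.asarray(x, dtype=float)
    quad = x @ Sinv @ x
    return np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(np.linalg.det(Sigma)))

p_opposite = density([1.0, -1.0])
p_same = density([1.0, 1.0])
print(p_opposite, p_same)  # the larger density marks the more likely outcome
```

Intuitively, a negative covariance favors outcomes where the two coordinates have opposite signs.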
Suppose that \(Y = \begin{bmatrix} 1 & 1 \\ 0 & 1 \\ 0 & 1 \end{bmatrix} X\) where \(X\) is the random vector above. What is the covariance matrix for \(Y\)?
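For a linear map \(Y = AX\), the covariance transforms as \(\on{Cov}(Y) = A \on{Cov}(X) A^\top\). A numerical check under the same assumed \(\on{Cov}(X)\):

```python
import numpy as np

A = np.array([[1, 1], [0, 1], [0, 1]], dtype=float)
Sigma = np.array([[4.0, -4.0], [-4.0, 9.0]])  # assumed Cov(X) from above

Cov_Y = A @ Sigma @ A.T  # Cov(AX) = A Cov(X) A^T
print(Cov_Y)
```

Note that the second and third rows of the result coincide, since \(Y_2 = Y_3\) by construction, so \(\on{Cov}(Y)\) is singular.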
The matrix equation \[\begin{bmatrix} 1 & -2 \\ 1 & -1 \\ 1 & 1 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} b_0 \\ b_1 \end{bmatrix} = \begin{bmatrix} 5 \\ 6 \\ 0 \\ 1 \end{bmatrix}\] is inconsistent: no exact solution exists. Find the least squares solution.
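One way to check a least squares computation is to solve the normal equations \(X^\top X \hat b = X^\top y\) and compare against NumPy's built-in routine:

```python
import numpy as np

X = np.array([[1, -2], [1, -1], [1, 1], [1, 2]], dtype=float)
y = np.array([5.0, 6.0, 0.0, 1.0])

# Normal equations: X^T X b_hat = X^T y
b_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check with the library least squares solver
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b_hat)
```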
The least squares solution from the last problem corresponds to a regression line for a collection of points in \(\R^2\). Draw a graph showing the regression line and the points it approximates.
What are the residuals for the regression line above?
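The residuals are the observed values minus the fitted values, \(r = y - X\hat b\). A sketch that also verifies the key geometric fact that residuals are orthogonal to the column space of \(X\):

```python
import numpy as np

X = np.array([[1, -2], [1, -1], [1, 1], [1, 2]], dtype=float)
y = np.array([5.0, 6.0, 0.0, 1.0])
b_hat = np.linalg.solve(X.T @ X, X.T @ y)

residuals = y - X @ b_hat  # observed minus fitted values
print(residuals)
print(X.T @ residuals)     # should be (numerically) zero
```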
Suppose that \(X b = y\) is a linear equation with no solutions. Assume that the columns of \(X\) are linearly independent.
For \(X = \begin{bmatrix} 1 & -2 \\ 1 & -1 \\ 1 & 1 \\ 1 & 2 \end{bmatrix}\), what are the domain and codomain of the linear transformation defined by \(X\)?
What are the dimensions of the four fundamental subspaces of \(X\)?
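The dimensions of all four fundamental subspaces follow from the rank \(r\) of \(X\) together with its shape \(m \times n\): the column space and row space both have dimension \(r\), the null space has dimension \(n - r\) (rank-nullity), and the left null space has dimension \(m - r\). A quick numerical check:

```python
import numpy as np

X = np.array([[1, -2], [1, -1], [1, 1], [1, 2]], dtype=float)
m, n = X.shape                 # X maps R^n -> R^m
r = np.linalg.matrix_rank(X)

print("column space dim:   ", r)      # subspace of R^m
print("row space dim:      ", r)      # subspace of R^n
print("null space dim:     ", n - r)  # rank-nullity
print("left null space dim:", m - r)
```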