This week we’ll continue to look at least squares regression using
linear algebra.
Monday, February 17
Last week, we introduced the four fundamental subspaces of a matrix.
We ended with the following exercises that we didn't have time to
finish:
(Exercise) Find the range and null space of the 3-by-3 matrix:
(Exercise) Why is every column of $A$ in $\operatorname{range}(A)$?
(Exercise) Prove that $\operatorname{range}(A) = \operatorname{span}\{\text{the columns of } A\}$.
Definition (Complementary Subspaces)
Two subspaces $V, W \subseteq \mathbb{R}^n$ are called complementary if $V \cap W = \{\mathbf{0}\}$ and $V + W$ spans $\mathbb{R}^n$.
An equivalent condition is that $\dim(V) + \dim(W) = n$ while $V \cap W = \{\mathbf{0}\}$.
Definition (Orthogonal Complement)
The orthogonal complement of a set $S \subseteq \mathbb{R}^n$ is the subspace
$$S^\perp = \{\mathbf{x} \in \mathbb{R}^n : \mathbf{x} \cdot \mathbf{s} = 0 \text{ for all } \mathbf{s} \in S\}.$$
Note that two subspaces $V, W \subseteq \mathbb{R}^n$ are orthogonal complements if and only if they are orthogonal and complementary. In particular, if $W = V^\perp$, then $V = W^\perp$, and $(S^\perp)^\perp = S$ as long as $S$ is a subspace.
Theorem (The Fundamental Theorem of Linear Algebra)
For any matrix $A \in \mathbb{R}^{m \times n}$,
$\operatorname{Row}(A)^\perp = \operatorname{Null}(A)$.
$\operatorname{Col}(A)^\perp = \operatorname{Null}(A^T)$.
In other words, the row space and null space of $A$ are orthogonal complements in the domain of $A$, which is $\mathbb{R}^n$, and the column space and left null space of $A$ are orthogonal complements in the codomain of $A$, which is $\mathbb{R}^m$.
Furthermore, the dimensions of $\operatorname{Row}(A)$ and $\operatorname{Col}(A)$ are the same, and both equal the rank of $A$.
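The theorem can be checked numerically. Below is a minimal NumPy sketch using an illustrative rank-2 matrix (an assumption for this example, not the matrix from class); the SVD provides orthonormal bases for all four fundamental subspaces at once.

```python
import numpy as np

# Illustrative rank-2 matrix (not the example from class).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

# The SVD exposes orthonormal bases for all four fundamental subspaces.
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))       # rank of A

row_space  = Vt[:r].T            # basis for Row(A)
null_space = Vt[r:].T            # basis for Null(A)
col_space  = U[:, :r]            # basis for Col(A)
left_null  = U[:, r:]            # basis for Null(A^T)

# Row(A) and Null(A) are orthogonal complements in the domain:
print(np.allclose(row_space.T @ null_space, 0))   # True
print(row_space.shape[1] + null_space.shape[1])   # 3 = dim of domain

# Col(A) and Null(A^T) are orthogonal complements in the codomain:
print(np.allclose(col_space.T @ left_null, 0))    # True
print(col_space.shape[1] + left_null.shape[1])    # 3 = dim of codomain
```

Note that the rank of the row space and column space agree (both equal $r$), matching the last claim of the theorem.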
Applications
Application 1. In linear regression, we needed to
find the vector $A\hat{x}$ in $\operatorname{Col}(A)$ that is closest to $b$.
Geometrically, this is the same as requiring the residual $b - A\hat{x}$
to be orthogonal to $\operatorname{Col}(A)$.
What is the set of vectors orthogonal to $\operatorname{Col}(A)$?
Use the Fundamental Theorem of Linear Algebra to show that
$b - A\hat{x}$ is orthogonal to $\operatorname{Col}(A)$
if and only if $A^T(b - A\hat{x}) = 0$.
If $A^T(b - A\hat{x}) = 0$,
then the equation can be rewritten
$$A^T A \hat{x} = A^T b,$$
which is the normal equation for least squares regression.
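As a sanity check, the normal equation can be solved directly and compared against NumPy's built-in least-squares routine. This sketch uses hypothetical data (a line fit through four made-up points), with $A$ as the design matrix and $b$ as the observations:

```python
import numpy as np

# Hypothetical data: fit y = c0 + c1*t through four points.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.4, 2.9, 4.1])
A = np.column_stack([np.ones_like(t), t])   # design matrix

# Normal equation: A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Same answer from NumPy's least-squares routine.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_normal, x_lstsq))       # True

# The residual b - Ax is orthogonal to every column of A:
residual = b - A @ x_normal
print(np.allclose(A.T @ residual, 0))       # True
```

The second check is exactly the geometric condition above: the residual lies in $\operatorname{Null}(A^T)$, the orthogonal complement of $\operatorname{Col}(A)$.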
Application 2. To solve the normal equations for
$\hat{x}$,
it helps if $A^T A$
is an invertible matrix. It turns out that $A^T A$
is invertible if and only if the columns of $A$
are linearly independent.
For any $A \in \mathbb{R}^{m \times n}$,
show that $\operatorname{Null}(A^T A) = \operatorname{Null}(A)$.
Hint: To prove that $\operatorname{Null}(A^T A) \subseteq \operatorname{Null}(A)$,
it helps to prove that if $A^T A x = 0$,
then $x^T A^T A x = \|Ax\|^2 = 0$,
which means that $Ax = 0$.
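The equality of the two null spaces can also be observed numerically. This sketch uses an illustrative wide matrix (an assumption, chosen so the null space is nontrivial) and pulls a null vector of $A^T A$ out of the SVD:

```python
import numpy as np

# Illustrative 2x3 matrix with a nontrivial null space.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# The last right-singular vector of A^T A spans its null space here.
_, s, Vt = np.linalg.svd(A.T @ A)
x = Vt[-1]

print(np.isclose(s[-1], 0.0, atol=1e-9))   # True: x is in Null(A^T A)
# Following the hint, ||Ax||^2 = x^T A^T A x = 0, so x is in Null(A) too:
print(np.allclose(A @ x, 0))               # True
```

Numerically, both null spaces have the same dimension as well, since $\operatorname{rank}(A^T A) = \operatorname{rank}(A)$.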
We ran out of time before we could answer these last two questions,
but they are good linear algebra review questions:
Why are the columns of $A$
linearly independent if and only if $\operatorname{Null}(A) = \{\mathbf{0}\}$?
Why is $A^T A$
invertible if and only if $\operatorname{Null}(A^T A) = \{\mathbf{0}\}$?
Hint: When is the map $x \mapsto A^T A x$
onto? When is it 1-to-1?
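While the questions above call for proofs, the claim itself is easy to see numerically. This sketch contrasts two hypothetical matrices, one with independent columns and one without:

```python
import numpy as np

# Independent columns: the null space is trivial and A^T A is invertible.
A_ind = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
print(np.linalg.matrix_rank(A_ind))                   # 2 (full column rank)
print(not np.isclose(np.linalg.det(A_ind.T @ A_ind), 0))  # True: invertible

# Dependent columns (second column = 2 * first): A^T A is singular.
A_dep = np.array([[1.0, 2.0],
                  [2.0, 4.0],
                  [3.0, 6.0]])
print(np.linalg.matrix_rank(A_dep))                   # 1
print(np.isclose(np.linalg.det(A_dep.T @ A_dep), 0))  # True: singular
```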