Today we finish 5.1 and start 5.2. Continue reading Section 5.2 for next class, and start reading 5.3. Work through recommended homework questions.
Tutorials: Quiz 9 covers 4.6, 5.1 and the first part of 5.2 (orthogonal complements).
Office hour: Mon 3:00-3:30, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.
T/F: A matrix with orthogonal columns is called an orthogonal matrix.
T/F: An orthogonal matrix must be square.
Questions: Why are orthonormal bases great? Are orthogonal bases great too?
Definition: A set of vectors $\{ \vv_1, \vv_2, \ldots, \vv_k \}$ in $\R^n$ is an orthogonal set if $\vv_i \cdot \vv_j = 0$ for $i \neq j$. It is an orthonormal set if in addition $\vv_i \cdot \vv_i = 1$ for each $i$, i.e., each vector is a unit vector.
Theorem 5.1: An orthogonal set of nonzero vectors is always linearly independent.
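Sketch of one standard argument: if $c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0$, then taking the dot product of both sides with $\vv_i$ kills all of the other terms and gives $c_i (\vv_i \cdot \vv_i) = 0$; since $\vv_i \neq \vec 0$, we have $\vv_i \cdot \vv_i \neq 0$, so $c_i = 0$.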
Definition: An orthogonal basis for a subspace $W$ of $\R^n$ is a basis of $W$ that is an orthogonal set. An orthonormal basis is a basis that is an orthonormal set.
To check that an orthogonal set of nonzero vectors is a basis for $W$, you only need to check that it spans $W$, since by Theorem 5.1 it is automatically linearly independent.
Note: An orthogonal basis can be converted to an orthonormal basis by dividing each vector by its length. We'll show in Section 5.3 that every subspace has an orthogonal basis.
Recall that if $\{ \vv_1, \vv_2, \ldots, \vv_k \}$ is any basis of a subspace $W$, then any $\vw$ in $W$ can be written uniquely as a linear combination of the vectors in the basis. In general, finding the coefficients involves solving a linear system. For an orthogonal basis, it is much easier:
Theorems 5.2/5.3: If $\{ \vv_1, \vv_2, \ldots, \vv_k \}$ is an orthogonal basis of a subspace $W$, and $\vw$ is in $W$, then $$ \vw = c_1 \vv_1 + \cdots + c_k \vv_k \qtext{where} c_i = \frac{\vw \cdot \vv_i}{\vv_i \cdot \vv_i} $$ If the basis is orthonormal, then $$ c_i = \vw \cdot \vv_i $$
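As a quick illustration (vectors chosen here just for convenience): $\vv_1 = [1, 1]$ and $\vv_2 = [1, -1]$ form an orthogonal basis of $\R^2$, and for $\vw = [3, 1]$ the formula gives $$ c_1 = \frac{\vw \cdot \vv_1}{\vv_1 \cdot \vv_1} = \frac{4}{2} = 2 \qtext{and} c_2 = \frac{\vw \cdot \vv_2}{\vv_2 \cdot \vv_2} = \frac{2}{2} = 1, $$ and indeed $2 \vv_1 + \vv_2 = [3, 1] = \vw$.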
Definition: A square matrix $Q$ with real entries whose columns form an orthonormal set is called an orthogonal matrix!
Note: In $\R^2$ and $\R^3$, orthogonal matrices correspond exactly to the rotations and reflections. This is an important geometric reason to study them. Another reason is that we will see in Section 5.4 that they are related to diagonalization of symmetric matrices.
Theorems 5.4 and 5.5: $Q$ is orthogonal if and only if $Q^T Q = I$, i.e. if and only if $Q$ is invertible and $Q^{-1} = Q^T$.
Examples: $A = \bmat{rrr} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \emat$, $B = \bmat{rr} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \emat$ and $C = \bmat{rr} 0 & 1 \\ 1 & 0 \emat$.
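For instance, the columns of $B$ are unit vectors, since $\cos^2 \theta + \sin^2 \theta = 1$, and their dot product is $-\cos \theta \sin \theta + \sin \theta \cos \theta = 0$, so $B^T B = I$ and $B$ is orthogonal. Similar checks work for $A$ and $C$.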
Theorem 5.7: If $Q$ is orthogonal, then its rows form an orthonormal set too.
Another way to put it is that $Q^T$ is also an orthogonal matrix.
Theorem 5.6: Let $Q$ be an $n \times n$ matrix. Then the following
statements are equivalent:
a. $Q$ is orthogonal.
b. $\|Q \vx\| = \|\vx\|$ for every $\vx$ in $\R^n$.
c. $Q\vx \cdot Q\vy = \vx \cdot \vy$ for every $\vx$ and $\vy$ in $\R^n$.
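To illustrate (b) with the rotation matrix $B$: writing $\vx = [x, y]$, $$ \|B\vx\|^2 = (x \cos \theta - y \sin \theta)^2 + (x \sin \theta + y \cos \theta)^2 = x^2 + y^2 = \|\vx\|^2, $$ so rotations preserve length, as expected.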
Example: Compute the eigenvalues and determinants of $C$ and $A$ on the board.
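For reference, one way the computation can go: $\det(C - \lambda I) = \lambda^2 - 1$, so $C$ has eigenvalues $\pm 1$ and $\det C = -1$; and $\det(A - \lambda I) = 1 - \lambda^3$, so $A$ has eigenvalues the three cube roots of unity, $1$ and $(-1 \pm \sqrt{3}\,i)/2$, and $\det A = 1$. In every case $|\lambda| = 1$ and the determinant is $\pm 1$, as the next theorem predicts.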
Theorem 5.8: Let $Q$ be an orthogonal matrix. Then:
a. $Q^{-1}$ is orthogonal.
b. $\det Q = \pm 1$
c. If $\lambda$ is a real or complex eigenvalue of $Q$, then $|\lambda| = 1$.
d. If $Q_1$ and $Q_2$ are orthogonal matrices of the same size,
then $Q_1 Q_2$ is orthogonal.
Proof:
(a) is Theorem 5.7, since $Q^{-1} = Q^T$.
(b): Since $I = Q^T Q$, we have $$1 = \det I = \det(Q^T Q) = \det(Q^T) \det(Q) = \det(Q)^2.$$ Therefore $\det(Q) = \pm 1$.
(c) If $Q\vv = \lambda \vv$, then $$ \|\vv\| = \|Q\vv\| = \|\lambda \vv\| = |\lambda| \|\vv\| $$ so $|\lambda| = 1$, since $\|\vv\| \neq 0$.
(d) Exercise, using properties of transpose.$\quad\Box$
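One way to do the exercise: $(Q_1 Q_2)^T (Q_1 Q_2) = Q_2^T Q_1^T Q_1 Q_2 = Q_2^T Q_2 = I$, so $Q_1 Q_2$ is orthogonal by Theorems 5.4/5.5.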
Definition: Let $W$ be a subspace of $\R^n$. A vector $\vv$ is orthogonal to $W$ if $\vv$ is orthogonal to every vector in $W$. The orthogonal complement of $W$ is the set of all vectors orthogonal to $W$ and is denoted $W^\perp$. So $$ \kern-4ex W^\perp = \{ \vv \in \R^n : \vv \cdot \vw = 0 \text{ for all } \vw \text{ in } W \} $$
In the example above, if we write $\ell = \span(\vn)$ for the line perpendicular to $W$, then $\ell = W^\perp$ and $W = \ell^\perp$.
Theorem 5.9: Let $W$ be a subspace of $\R^n$. Then:
a. $W^\perp$ is a subspace of $\R^n$.
b. $(W^\perp)^\perp = W$
c. $W \cap W^\perp = \{ \vec 0 \}$
d. If $W = \span(\vw_1, \ldots, \vw_k)$, then $\vv$ is in $W^\perp$
if and only if $\vv \cdot \vw_i = 0$ for all $i$.
Explain (a), (c), (d) on board. (b) will be Corollary 5.12.
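Brief sketches, in case they are useful: (a) $\vec 0$ is in $W^\perp$, and if $\vu$ and $\vv$ are in $W^\perp$, then $(\vu + \vv) \cdot \vw = \vu \cdot \vw + \vv \cdot \vw = 0$ and $(c \vu) \cdot \vw = c (\vu \cdot \vw) = 0$ for every $\vw$ in $W$, so $W^\perp$ is closed under addition and scalar multiplication. (c) If $\vv$ is in both $W$ and $W^\perp$, then $\vv \cdot \vv = 0$, so $\vv = \vec 0$. (d) If $\vv \cdot \vw_i = 0$ for each $i$ and $\vw = c_1 \vw_1 + \cdots + c_k \vw_k$ is in $W$, then $\vv \cdot \vw = c_1 (\vv \cdot \vw_1) + \cdots + c_k (\vv \cdot \vw_k) = 0$.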
Theorem 5.10: Let $A$ be an $m \times n$ matrix. Then $$ \kern-4ex (\row(A))^\perp = \null(A) \qtext{and} (\col(A))^\perp = \null(A^T) $$
The first two are in $\R^n$ and the last two are in $\R^m$. These are the four fundamental subspaces of $A$.
Let's see why $(\row(A))^\perp = \null(A)$. A vector $\vx$ is in $\null(A)$ exactly when $A\vx = \vec 0$, i.e., exactly when $\vx$ is orthogonal to the rows of $A$, since the entries of $A\vx$ are the dot products of the rows of $A$ with $\vx$. But the rows of $A$ span $\row(A)$, so the vectors in $\null(A)$ are exactly those which are orthogonal to $\row(A)$, by 5.9(d).
The fact that $(\col(A))^\perp = \null(A^T)$ follows by replacing $A$ with $A^T$.
Example: Let $W$ be the subspace spanned by $\vv_1 = [ 1, 2, 3 ]$ and $\vv_2 = [ 2, 5, 7 ]$. Find a basis for $W^\perp$.
Solution: Let $A$ be the matrix with $\vv_1$ and $\vv_2$ as rows. Then $W = \row(A)$, so $W^\perp = \null(A)$. Continue on board.
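One way the computation might continue: $$ A = \bmat{rrr} 1 & 2 & 3 \\ 2 & 5 & 7 \emat \longrightarrow \bmat{rrr} 1 & 0 & 1 \\ 0 & 1 & 1 \emat, $$ so the vectors in $\null(A)$ are those of the form $[-t, -t, t]$, and $\{ [-1, -1, 1] \}$ is a basis for $W^\perp$. (Check: $[-1, -1, 1]$ is orthogonal to both $\vv_1$ and $\vv_2$.)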
Exercise: Let $W$ be the subspace spanned by $\vv_1 = [ 1, 2, 3 ]$. Find a basis for $W^\perp$.
Could use the same method. Or could just find two linearly independent vectors perpendicular to $\vv_1$ by inspection.
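For instance, $[2, -1, 0]$ and $[3, 0, -1]$ are perpendicular to $\vv_1$ and linearly independent; since $W^\perp = \null(\bmat{rrr} 1 & 2 & 3 \emat)$ has dimension $3 - 1 = 2$ by the Rank Theorem, they form a basis for $W^\perp$.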
We didn't name it then, but we also noticed that $\vv - \proj_{\vu}(\vv)$ is orthogonal to $\vu$. Let's call this $\Perp_{\vu}(\vv)$.
So if we write $W = \span(\vu)$, then $\vw = \proj_{\vu}(\vv)$ is in $W$, $\vw^\perp = \Perp_{\vu}(\vv)$ is in $W^\perp$, and $\vv = \vw + \vw^\perp$. We can do this more generally:
Definition: Let $\{ \vu_1, \ldots, \vu_k \}$ be an orthogonal basis for a subspace $W$ of $\R^n$ and let $\vv$ be in $\R^n$. The orthogonal projection of $\vv$ onto $W$ is $$ \proj_W(\vv) = \proj_{\vu_1}(\vv) + \cdots + \proj_{\vu_k}(\vv) $$ and the component of $\vv$ orthogonal to $W$ is $$ \Perp_W(\vv) = \vv - \proj_W(\vv). $$ We will show soon that $\Perp_W(\vv)$ is in $W^\perp$.
Note that multiplying $\vu$ by a scalar in the earlier example doesn't change $W$, $\vw$ or $\vw^\perp$. We'll see later that the general definition also doesn't depend on the choice of orthogonal basis.
Example: Let $W = \span(\vu_1, \vu_2)$, where $\vu_1 = \collll 1 1 0 0$ and $\vu_2 = \collll 0 0 1 1$. Compute $\proj_W(\vv)$ and $\Perp_W(\vv)$, where $\vv = \collll 1 3 2 4$. On board.
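One possible write-up: since $\{ \vu_1, \vu_2 \}$ is an orthogonal basis for $W$, $$ \proj_W(\vv) = \frac{\vv \cdot \vu_1}{\vu_1 \cdot \vu_1} \vu_1 + \frac{\vv \cdot \vu_2}{\vu_2 \cdot \vu_2} \vu_2 = \frac{4}{2} \vu_1 + \frac{6}{2} \vu_2 = \collll 2 2 3 3 \qtext{and} \Perp_W(\vv) = \vv - \proj_W(\vv) = \collll {-1} 1 {-1} 1 . $$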
Notice that $\Perp_W(\vv)$ is in $W^\perp$.