Today we finish Section 5.2 and start Section 5.3. Read Sections 5.3 and 5.4 for next class. Work through the recommended homework questions.
Tutorials: Review and questions.
Office hours: Mon 1:30-2:30 and Wed 10:30-11:15, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.
Final exam: Covers the whole course, with an emphasis on the material after the midterm. Our course will end with Section 5.4.
Question: If $W = \R^n$, then $W^\perp = \query{\{ \vec 0 \}}$
T/F: An orthogonal basis $\{ \vv_1, \ldots, \vv_k \}$ must have $$ \vv_i \cdot \vv_j = \begin{cases} 0 & \text{if } i \neq j \\ 1 & \text{if } i = j \end{cases} $$
Definition: Let $W$ be a subspace of $\R^n$. A vector $\vv$ is orthogonal to $W$ if $\vv$ is orthogonal to every vector in $W$. The orthogonal complement of $W$ is the set of all vectors orthogonal to $W$ and is denoted $W^\perp$. So $$ \kern-4ex W^\perp = \{ \vv \in \R^n : \vv \cdot \vw = 0 \text{ for all } \vw \text{ in } W \} $$
An example to keep in mind is where $W$ is a plane through the origin in $\R^3$ and $W^\perp$ is $\span(\vn)$, where $\vn$ is the normal vector to $W$.
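As a concrete illustration (this particular plane is our own choice): if $W$ is the plane $x + y + z = 0$ in $\R^3$, then $\vn = \colll 1 1 1$ is a normal vector and $$ W^\perp = \span(\vn) = \left\{ t \colll 1 1 1 : t \text{ in } \R \right\} $$ the line through the origin in the direction of $\vn$.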
Theorem 5.9: Let $W$ be a subspace of $\R^n$. Then:
a. $W^\perp$ is a subspace of $\R^n$.
b. $(W^\perp)^\perp = W$
c. $W \cap W^\perp = \{ \vec 0 \}$
d. If $W = \span(\vw_1, \ldots, \vw_k)$, then $\vv$ is in $W^\perp$
if and only if $\vv \cdot \vw_i = 0$ for all $i$.
We proved all of these except part (b), which will come today.
Theorem 5.10: Let $A$ be an $m \times n$ matrix. Then $$ \kern-4ex (\row(A))^\perp = \null(A) \qtext{and} (\col(A))^\perp = \null(A^T) $$
Here $\row(A)$ and $\null(A)$ live in $\R^n$, while $\col(A)$ and $\null(A^T)$ live in $\R^m$. These are the four fundamental subspaces of $A$.
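To make this concrete, here is a small example with a matrix of our own choosing: $$ \kern-4ex A = \begin{bmatrix} 1 & 1 & 0 \\ -2 & 0 & 1 \end{bmatrix} \qtext{has} \null(A) = \span\left( \colll 1 {-1} 2 \right) $$ (solve $A \vx = \vec 0$ to check), and indeed $\colll 1 {-1} 2$ is orthogonal to both rows of $A$: $1 - 1 + 0 = 0$ and $-2 + 0 + 2 = 0$.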
If we write $W = \span(\vu)$, then $\vw = \proj_{\vu}(\vv)$ is in $W$, $\vw^\perp = \Perp_{\vu}(\vv)$ is in $W^\perp$, and $\vv = \vw + \vw^\perp$. We can do this more generally:
Definition: Let $W$ be a subspace of $\R^n$ and let $\{ \vu_1, \ldots, \vu_k \}$ be an orthogonal basis for $W$. For $\vv$ in $\R^n$, the orthogonal projection of $\vv$ onto $W$ is the vector $$ \kern-8ex \proj_W(\vv) = \proj_{\vu_1}(\vv) + \cdots + \proj_{\vu_k}(\vv) $$ The component of $\vv$ orthogonal to $W$ is the vector $$ \kern-8ex \Perp_W(\vv) = \vv - \proj_W(\vv) $$
We will show soon that $\Perp_W(\vv)$ is in $W^\perp$.
Note that multiplying $\vu$ by a nonzero scalar in the earlier example doesn't change $W$, $\vw$ or $\vw^\perp$. We'll see later that the general definition also doesn't depend on the choice of orthogonal basis.
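A quick sanity check, with an example of our own: let $W$ be the $xy$-plane in $\R^3$, with orthogonal basis $\vu_1 = \colll 1 0 0$ and $\vu_2 = \colll 0 1 0$, and let $\vv = \colll 1 2 3$. Then $$ \kern-4ex \proj_W(\vv) = \proj_{\vu_1}(\vv) + \proj_{\vu_2}(\vv) = \colll 1 0 0 + \colll 0 2 0 = \colll 1 2 0 \qtext{and} \Perp_W(\vv) = \colll 0 0 3 $$ which is indeed orthogonal to every vector in the $xy$-plane.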
Theorem: $\Perp_W(\vv)$ is in $W^\perp$.
Explain on board.
Now we will see that $\proj$ and $\Perp$ don't depend on the choice of orthogonal basis. Here and in the rest of the section, we assume that every subspace has at least one orthogonal basis. (The Gram-Schmidt process at the end of this section shows that this is always true.)
Theorem 5.11: Let $W$ be a subspace of $\R^n$ and let $\vv$ be a vector in $\R^n$. Then there are unique vectors $\vw$ in $W$ and $\vw^\perp$ in $W^\perp$ such that $\vv = \vw + \vw^\perp$.
Proof: We saw above that such a decomposition exists, by taking $\vw = \proj_W(\vv)$ and $\vw^\perp = \Perp_W(\vv)$, using an orthogonal basis for $W$.
We now show that this decomposition is unique. Suppose $\vv = \vw_1 + \vw_1^\perp$ is another such decomposition. Then $\vw + \vw^\perp = \vw_1 + \vw_1^\perp$, so $$\vw - \vw_1 = \vw_1^\perp - \vw^\perp$$ The left-hand side is in $W$ and the right-hand side is in $W^\perp$ (why?), so both sides must be $\vec 0$ (why?). So $\vw = \vw_1$ and $\vw^\perp = \vw_1^\perp$.$\quad\Box$
In particular, $\vw = \proj_W(\vv)$ and $\vw^\perp = \Perp_W(\vv)$ no matter which orthogonal basis of $W$ is used to compute them, so $\proj_W$ and $\Perp_W$ do not depend on that choice.
Note that $\perp$ is an operation on subspaces, but not on vectors: the superscript in $\vw^\perp$ is just part of the name of that vector, not an operation applied to $\vw$.
Now we can prove part (b) of Theorem 5.9.
Corollary 5.12: If $W$ is a subspace of $\R^n$, then $(W^\perp)^\perp = W$.
Proof: If $\vw$ is in $W$, then $\vw \cdot \vx = 0$ for every $\vx$ in $W^\perp$, which means that $\vw$ is in $(W^\perp)^\perp$. So $W \subseteq (W^\perp)^\perp$.
We need to show that every vector in $(W^\perp)^\perp$ is in $W$. So let $\vv$ be a vector in $(W^\perp)^\perp$. By the previous result, we can write $\vv$ as $\vw + \vw^\perp$, where $\vw$ is in $W$ and $\vw^\perp$ is in $W^\perp$. Since $\vv$ is in $(W^\perp)^\perp$ and $\vw^\perp$ is in $W^\perp$, we have $\vv \cdot \vw^\perp = 0$, so $$ \kern-4ex \begin{aligned} 0 &= \vv \cdot \vw^\perp = (\vw + \vw^\perp)\cdot\vw^\perp \\ &= \vw \cdot \vw^\perp + \vw^\perp \cdot \vw^\perp = 0 + \vw^\perp \cdot \vw^\perp = \vw^\perp \cdot \vw^\perp \end{aligned} $$ So $\vw^\perp = \vec 0$ and $\vv = \vw$ is in $W$.$\quad\Box$
This next result is related to the Rank Theorem:
Theorem 5.13: If $W$ is a subspace of $\R^n$, then $$ \dim W + \dim W^\perp = n $$
Proof: Let $\{ \vu_1, \ldots, \vu_k \}$ be an orthogonal basis of $W$ and let $\{ \vv_1, \ldots, \vv_\ell \}$ be an orthogonal basis of $W^\perp$. Then $\{ \vu_1, \ldots, \vu_k, \vv_1, \ldots, \vv_\ell \}$ is an orthogonal basis for $\R^n$: each $\vu_i \cdot \vv_j = 0$ because $\vv_j$ is in $W^\perp$, and the set spans $\R^n$ because every vector in $\R^n$ is $\vw + \vw^\perp$ with $\vw$ in $W$ and $\vw^\perp$ in $W^\perp$, by Theorem 5.11. The result follows.$\quad\Box$
Example: For $W$ a plane through the origin in $\R^3$, $\dim W = 2$ and $\dim W^\perp = 1$, and $2 + 1 = 3$.
The Rank Theorem follows if we take $W = \row(A)$, since then $W^\perp = \null(A)$, $\dim W = \rank(A)$ and $\dim W^\perp = \nullity(A)$:
Corollary 5.14 (The Rank Theorem, again): If $A$ is an $m \times n$ matrix, then $$ \rank(A) + \nullity(A) = n $$
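For the illustrative matrix $A$ used after Theorem 5.10 (our own example), this reads $$ \rank(A) + \nullity(A) = 2 + 1 = 3 = n $$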
Note: The logic here can be reversed. We can use the rank theorem to prove Theorem 5.13, and Theorem 5.13 can be used to prove Corollary 5.12.
Example: Let $W = \span(\vx_1, \vx_2)$ where $\vx_1 = \colll 1 1 0$ and $\vx_2 = \colll {-2} 0 1$. Find an orthogonal basis for $W$.
Solution: Ideas? Do on board.
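For reference, here is one way the computation can go; it is exactly the process formalized in Theorem 5.15 below. Keep $\vv_1 = \vx_1$, and replace $\vx_2$ by its component orthogonal to $\vv_1$: $$ \kern-4ex \vv_2 = \vx_2 - \frac{\vv_1 \cdot \vx_2}{\vv_1 \cdot \vv_1} \vv_1 = \colll {-2} 0 1 - \frac{-2}{2} \colll 1 1 0 = \colll {-1} 1 1 $$ Then $\vv_1 \cdot \vv_2 = -1 + 1 + 0 = 0$ and $\span(\vv_1, \vv_2) = \span(\vx_1, \vx_2) = W$, so $\{ \vv_1, \vv_2 \}$ is an orthogonal basis for $W$.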
Question: What if we had a third basis vector $\vx_3$?
Theorem 5.15 (The Gram-Schmidt Process): Let $\{ \vx_1, \ldots, \vx_k \}$ be a basis for a subspace $W$ of $\R^n$. Write $W_1 = \span(\vx_1)$, $W_2 = \span(\vx_1, \vx_2)$, $\ldots$, $W_k = \span(\vx_1, \ldots, \vx_k)$. Define: $$ \kern-9ex \begin{aligned} \vv_1 &= \vx_1 \\ \vv_2 &= \Perp_{W_1}(\vx_2) = \vx_2 - \frac{\vv_1 \cdot \vx_2}{\vv_1 \cdot \vv_1} \vv_1 \\ \vv_3 &= \Perp_{W_2}(\vx_3) = \vx_3 - \frac{\vv_1 \cdot \vx_3}{\vv_1 \cdot \vv_1} \vv_1 - \frac{\vv_2 \cdot \vx_3}{\vv_2 \cdot \vv_2} \vv_2 \\ &\vdots \\ \vv_k &= \Perp_{W_{k-1}}(\vx_k) = \vx_k - \frac{\vv_1 \cdot \vx_k}{\vv_1 \cdot \vv_1} \vv_1 - \cdots - \frac{\vv_{k-1} \cdot \vx_k}{\vv_{k-1} \cdot \vv_{k-1}} \vv_{k-1} \end{aligned} $$ Then for each $i$, $\{ \vv_1, \ldots, \vv_i \}$ is an orthogonal basis for $W_i$. In particular, $\{ \vv_1, \ldots, \vv_k \}$ is an orthogonal basis for $W = W_k$.
Explain verbally.
Notes: To compute $\Perp_{W_i}$, you must use the orthogonal basis of $\vv_j$'s that you have already constructed, not the original basis of $\vx_j$'s.
The basis you get depends on the order of the vectors you start with. You should always do a question using the vectors in the order given, since that order will be chosen to minimize the arithmetic.
If you are asked to find an orthonormal basis, normalize each $\vv_j$ at the end. (It is correct to normalize earlier, but can be messier.)
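Continuing the worked example above: $\vv_1 \cdot \vv_1 = 2$ and $\vv_2 \cdot \vv_2 = 3$, so normalizing at the end gives the orthonormal basis $$ \kern-4ex \left\{ \frac{1}{\sqrt 2} \colll 1 1 0 ,\ \frac{1}{\sqrt 3} \colll {-1} 1 1 \right\} $$ for $W$.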