Today we finish 5.3 and start 5.4. Read Section 5.4 for Wednesday. Work through recommended homework questions.
Tutorials: review this week.
Office hour: Monday, 1:30-2:30, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.
Review Session: Friday in class; bring questions.
Final exam: Covers whole course, with an emphasis on the material in Chapters 4 and 5 (after the midterm). Our course will end with Section 5.4.
Definition: Let $W$ be a subspace of $\R^n$. A vector $\vv$ is orthogonal to $W$ if $\vv$ is orthogonal to every vector in $W$. The orthogonal complement of $W$ is the set of all vectors orthogonal to $W$ and is denoted $W^\perp$. So $$ \kern-4ex W^\perp = \{ \vv \in \R^n : \vv \cdot \vw = 0 \text{ for all } \vw \text{ in } W \} $$
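For example (a quick illustration, not from the text): let $W$ be the $xy$-plane in $\R^3$, spanned by $\colll 1 0 0$ and $\colll 0 1 0$. A vector is orthogonal to $W$ exactly when it is orthogonal to both of these spanning vectors, so $W^\perp$ is the $z$-axis, spanned by $\colll 0 0 1$.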
Definition: Let $W$ be a subspace of $\R^n$ and let $\{ \vu_1, \ldots, \vu_k \}$ be an orthogonal basis for $W$. For $\vv$ in $\R^n$, the orthogonal projection of $\vv$ onto $W$ is the vector $$ \proj_W(\vv) = \proj_{\vu_1}(\vv) + \cdots + \proj_{\vu_k}(\vv) $$ The component of $\vv$ orthogonal to $W$ is the vector $$ \Perp_W(\vv) = \vv - \proj_W(\vv) $$
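Continuing the illustration above (again not from the text): $\vu_1 = \colll 1 1 0$ and $\vu_2 = \colll 1 {-1} 0$ form an orthogonal basis for the $xy$-plane $W$ in $\R^3$, and for $\vv = \colll 3 1 5$ we get $$ \proj_W(\vv) = \frac{\vu_1 \cdot \vv}{\vu_1 \cdot \vu_1} \vu_1 + \frac{\vu_2 \cdot \vv}{\vu_2 \cdot \vu_2} \vu_2 = \frac{4}{2} \colll 1 1 0 + \frac{2}{2} \colll 1 {-1} 0 = \colll 3 1 0 $$ and $\Perp_W(\vv) = \vv - \proj_W(\vv) = \colll 0 0 5$, which is indeed in $W^\perp$.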
We showed that $\proj_W(\vv)$ is in $W$ and $\Perp_W(\vv)$ is in $W^\perp$.
In Section 5.2, we assumed that every subspace has at least one orthogonal basis. The Gram-Schmidt process (Theorem 5.15 below) shows that this is always the case.
Theorem 5.11: Let $W$ be a subspace of $\R^n$ and let $\vv$ be a vector in $\R^n$. Then there are unique vectors $\vw$ in $W$ and $\vw^\perp$ in $W^\perp$ such that $\vv = \vw + \vw^\perp$.
Theorem 5.13: If $W$ is a subspace of $\R^n$, then $$ \dim W + \dim W^\perp = n $$
The Rank Theorem follows if we take $W = \row(A)$, since then $W^\perp = \null(A)$.
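In more detail (filling in the step): $A\vx = \vec 0$ says exactly that $\vx$ is orthogonal to each row of $A$, hence to all of $\row(A)$, so $\null(A) = \row(A)^\perp$. Theorem 5.13 with $W = \row(A)$ then gives $\dim \row(A) + \dim \null(A) = n$, i.e. $$ \operatorname{rank}(A) + \operatorname{nullity}(A) = n . $$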
Theorem 5.15 (The Gram-Schmidt Process): Let $\{ \vx_1, \ldots, \vx_k \}$ be a basis for a subspace $W$ of $\R^n$. Write $W_1 = \span(\vx_1)$, $W_2 = \span(\vx_1, \vx_2)$, $\ldots$, $W_k = \span(\vx_1, \ldots, \vx_k)$. Define: $$ \kern-9ex \begin{aligned} \vv_1 &= \vx_1 \\ \vv_2 &= \Perp_{W_1}(\vx_2) = \vx_2 - \frac{\vv_1 \cdot \vx_2}{\vv_1 \cdot \vv_1} \vv_1 \\ \vv_3 &= \Perp_{W_2}(\vx_3) = \vx_3 - \frac{\vv_1 \cdot \vx_3}{\vv_1 \cdot \vv_1} \vv_1 - \frac{\vv_2 \cdot \vx_3}{\vv_2 \cdot \vv_2} \vv_2 \\ &\vdots \\ \vv_k &= \Perp_{W_{k-1}}(\vx_k) = \vx_k - \frac{\vv_1 \cdot \vx_k}{\vv_1 \cdot \vv_1} \vv_1 - \cdots - \frac{\vv_{k-1} \cdot \vx_k}{\vv_{k-1} \cdot \vv_{k-1}} \vv_{k-1} \end{aligned} $$ Then for each $i$, $\{ \vv_1, \ldots, \vv_i \}$ is an orthogonal basis for $W_i$. In particular, $\{ \vv_1, \ldots, \vv_k \}$ is an orthogonal basis for $W = W_k$.
Notes: To compute $\Perp_{W_i}$ you have to use the orthogonal basis of $\vv_j$'s that you have constructed already, not the original basis of $\vx_j$'s.
The basis you get depends on the order of the vectors you start with. You should always do a question using the vectors in the order given, since that order will be chosen to minimize the arithmetic.
If you are asked to find an orthonormal basis, normalize each $\vv_j$ at the end. (It is correct to normalize earlier, but can be messier.)
Example 5.13: Apply Gram-Schmidt to construct an orthogonal basis for the subspace $W = \span(\vx_1, \vx_2, \vx_3)$ of $\R^4$ where $$ \vx_1 = \collll 1 {-1} {-1} 1, \quad \vx_2 = \collll 2 {1} {0} 1, \quad \vx_3 = \collll 2 {2} {1} 2 $$ On whiteboard, scaling intermediate results. We get $$ \vv_1 = \collll 1 {-1} {-1} 1, \quad \vv'_2 = \collll 3 3 1 1, \quad \vv'_3 = \collll {-1} 0 1 2 $$ If we want an orthonormal basis, we scale these: $$ \kern-9ex \vq_1 = {\textstyle\frac{1}{\|\vv_1\|}} \! \vv_1 = \frac{1}{2} \! \collll 1 {\!-1} {\!-1} 1, \ \vq_2 = {\textstyle\frac{1}{\|\vv'_2\|}}\! \vv'_2 = \frac{1}{\sqrt{20}}\! \collll 3 3 1 1, \ \vq_3 = {\textstyle\frac{1}{\|\vv'_3\|}}\! \vv'_3 = \frac{1}{\sqrt{6}} \! \collll {\!-1} 0 1 2 $$
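Sketch of the whiteboard computation (scaling intermediate results to avoid fractions; the scalings used in class may have differed): $$ \vv_2 = \vx_2 - \frac{\vv_1 \cdot \vx_2}{\vv_1 \cdot \vv_1} \vv_1 = \vx_2 - \frac{2}{4} \vv_1 = \collll {3/2} {3/2} {1/2} {1/2} , $$ which we scale by $2$ to get $\vv'_2$. (Scaling does not affect later steps, since $\proj_{c\vu}(\vx) = \proj_{\vu}(\vx)$ for $c \neq 0$.) Then $$ \vv_3 = \vx_3 - \frac{\vv_1 \cdot \vx_3}{\vv_1 \cdot \vv_1} \vv_1 - \frac{\vv'_2 \cdot \vx_3}{\vv'_2 \cdot \vv'_2} \vv'_2 = \vx_3 - \frac{1}{4} \vv_1 - \frac{15}{20} \vv'_2 = \collll {-1/2} 0 {1/2} 1 , $$ which we scale by $2$ to get $\vv'_3$.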
Note: You don't need to check that the starting vectors are linearly independent. If they are dependent, then one or more of the $\vv_j$'s will be $\vec 0$, and you can just discard them.
Example: Is $\vw = \collll 8 {-2} 2 0$ in $W$?
Solution: We can compute that $\proj_W(\vw) = 2 \vv_1 + \vv'_2 - \vv'_3 = \collll 6 1 {-2} 1$, which is not equal to $\vw$, so the answer is no. (Note that $\vw$ is in $W$ if and only if $\proj_W(\vw) = \vw$.)
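Filling in the arithmetic: the coefficients come from dot products with the orthogonal basis $\{ \vv_1, \vv'_2, \vv'_3 \}$ of $W$ found above: $$ \frac{\vv_1 \cdot \vw}{\vv_1 \cdot \vv_1} = \frac{8}{4} = 2, \qquad \frac{\vv'_2 \cdot \vw}{\vv'_2 \cdot \vv'_2} = \frac{20}{20} = 1, \qquad \frac{\vv'_3 \cdot \vw}{\vv'_3 \cdot \vv'_3} = \frac{-6}{6} = -1 . $$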
Example: Compute the projection of $\vw = \collll 8 {-2} 2 0$ onto $\span(\vx_1, \vx_2)$.
Solution: We use that $\span(\vx_1, \vx_2) = \span(\vv_1, \vv'_2)$. So, by the work done for the previous example, we get $2 \vv_1 + \vv'_2$. (Do not try to work directly with $\vx_1$ and $\vx_2$.)
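Written out explicitly, this projection is $$ 2 \vv_1 + \vv'_2 = \collll 2 {-2} {-2} 2 + \collll 3 3 1 1 = \collll 5 1 {-1} 3 . $$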
Example 5.14: Find an orthogonal basis for $\R^3$ that contains the vector $\vv_1 = \colll 1 2 3$.
Solution: Choose any two vectors $\vx_2$ and $\vx_3$ so that $\{ \vv_1, \vx_2, \vx_3 \}$ is a basis for $\R^3$. For example, you can take $$ \vx_2 = \colll 0 1 0 \qtext{and} \vx_3 = \colll 0 0 1 $$ Then apply Gram-Schmidt, using the vectors in that order, so $\vv_1$ doesn't change. (Details in text.)
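A sketch of how the computation can come out (the text may scale the intermediate vectors differently): $$ \vv_2 = \vx_2 - \frac{\vv_1 \cdot \vx_2}{\vv_1 \cdot \vv_1} \vv_1 = \vx_2 - \frac{2}{14} \vv_1 = \colll {-1/7} {5/7} {-3/7} , $$ which we scale to $\vv'_2 = \colll {-1} 5 {-3}$, and then $$ \vv_3 = \vx_3 - \frac{\vv_1 \cdot \vx_3}{\vv_1 \cdot \vv_1} \vv_1 - \frac{\vv'_2 \cdot \vx_3}{\vv'_2 \cdot \vv'_2} \vv'_2 = \vx_3 - \frac{3}{14} \vv_1 + \frac{3}{35} \vv'_2 = \colll {-3/10} 0 {1/10} , $$ which we scale to $\vv'_3 = \colll {-3} 0 1$. So one orthogonal basis for $\R^3$ containing $\vv_1$ is $\left\{ \colll 1 2 3, \colll {-1} 5 {-3}, \colll {-3} 0 1 \right\}$.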
Theorem 5.16: Let $A$ be an $m \times n$ matrix with linearly independent columns. Then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix with orthonormal columns and $R$ is an invertible upper triangular $n \times n$ matrix.
Note that we must have $m \geq n$. (Why?)
Explanation: Write $\va_1, \ldots, \va_n$ for the linearly independent columns of $A$. Apply Gram-Schmidt and normalize to produce orthonormal vectors $\vq_1, \ldots, \vq_n$ with $\span(\va_1, \ldots, \va_i) = \span(\vq_1, \ldots, \vq_i)$ for each $i$. Therefore we can find scalars $r_{ij}$ such that: $$ \begin{aligned} \va_1 &= r_{11} \vq_1 \\ \va_2 &= r_{12} \vq_1 + r_{22} \vq_2 \\ &\vdots \\ \va_n &= r_{1n} \vq_1 + r_{2n} \vq_2 + \cdots + r_{nn} \vq_n \\ \end{aligned} $$ That is, $$ \kern-8ex A = [\, \va_1 \cdots \va_n \,] = [\, \vq_1 \cdots \vq_n \,] \bmat{cccc} r_{11} & r_{12} & \cdots & r_{1n} \\ 0 & r_{22} & \cdots & r_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & r_{nn} \emat = QR $$ One can also see that the diagonal entries $r_{ii}$ are non-zero. (Explain.) Therefore, $\det R \neq 0$ and $R$ is invertible.
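(One way to answer the two questions above: the $n$ columns of $A$ are linearly independent vectors in $\R^m$, and $\R^m$ contains at most $m$ linearly independent vectors, so $n \leq m$. And if some $r_{ii}$ were zero, the equations above would put $\va_i$ in $\span(\vq_1, \ldots, \vq_{i-1}) = \span(\va_1, \ldots, \va_{i-1})$, contradicting the linear independence of the columns.)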
Note that $r_{ij} = \vq_i \cdot \va_j$: take the dot product of $\vq_i$ with both sides of the equation for $\va_j$ above and use that the $\vq$'s are orthonormal.
Example 5.15: Find a QR factorization of $A = \bmat{rrr} 1 & 2 & 2 \\ -1 & 1 & 2 \\ -1 & 0 & 1 \\ 1 & 1 & 2 \emat$.
Solution: The columns of $A$ are the vectors from Example 5.13, so we get the matrix $$ Q = \bmat{ccc} \ph 1/2 & 3/\sqrt{20} & -1/\sqrt{6} \\ -1/2 & 3/\sqrt{20} & 0 \\ -1/2 & 1/\sqrt{20} & 1/\sqrt{6} \\ \ph 1/2 & 1/\sqrt{20} & 2/\sqrt{6} \\ \emat $$ We want to find $R$ such that $A = QR$. Since the columns of $Q$ are orthonormal, we have $Q^T Q = I$. So $Q^T A = Q^T Q R = R$ and one can compute $R$ by matrix multiplication to find $$ R = Q^T A = \cdots = \bmat{ccc} 2 & 1 & 1/2 \\ 0 & \sqrt{5} & 3 \sqrt{5}/2 \\ 0 & 0 & \sqrt{6}/2 \emat $$ (See text for details.) Note that you can save some work, since you know that the entries below the diagonal must be zero.
Also note that this matrix multiplication is exactly working out the components of each column $\va_j$ with respect to the orthonormal basis of $\vq_i$'s, using that $r_{ij} = \vq_i \cdot \va_j$.
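For instance, checking one entry: $$ r_{23} = \vq_2 \cdot \va_3 = \frac{1}{\sqrt{20}} \, (3 \cdot 2 + 3 \cdot 2 + 1 \cdot 1 + 1 \cdot 2) = \frac{15}{\sqrt{20}} = \frac{3 \sqrt{5}}{2} , $$ which agrees with the $(2,3)$ entry of $R$ above.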
Recall that a square matrix $A$ is symmetric if $A^T = A$.
Examples: $\bmat{rr} 1 & 2 \\ 2 & 3 \emat$, $\bmat{rr} 3 & 2 \\ 2 & 3 \emat$, $\bmat{rr} 1 & 0 \\ 0 & 3 \emat$, $\bmat{rrr} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \emat$.
Non-examples: $\bmat{rr} 3 & -2 \\ 2 & 3 \emat$, $\bmat{rrr} 1 & 2 & 3 \\ 5 & 4 & 5 \\ 3 & 2 & 6 \emat$.
Definition: A square matrix $A$ is orthogonally diagonalizable if there exists an orthogonal matrix $Q$ such that $Q^T A Q$ is a diagonal matrix $D$.
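For example (a quick illustration, not worked in class): the symmetric matrix $A = \bmat{rr} 3 & 2 \\ 2 & 3 \emat$ from above is orthogonally diagonalizable, since the orthogonal matrix $Q = \frac{1}{\sqrt{2}} \bmat{rr} 1 & 1 \\ 1 & -1 \emat$ satisfies $$ Q^T A Q = \bmat{rr} 5 & 0 \\ 0 & 1 \emat $$ We'll see in Section 5.4 how to find such a $Q$ in general.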