Today we continue Section 3.5. Continue reading Section 3.5 and work through the suggested exercises.
Homework 7 is on WeBWorK and is due on Friday at 11:55pm.
Math Help Centre: M-F 12:30-5:30 in PAB48/49 and online 6pm-8pm.
My next office hour is today 3:30-4:20 in MC130.
The midterm is on Saturday, November 9, 2-4pm. It will cover everything up to what we get to on Friday, November 1, probably through the end of 3.5.
Definition: A subspace of $\R^n$ is any collection $S$ of
vectors in $\R^n$ such that:
1. The zero vector $\vec 0$ is in $S$.
2. $S$ is closed under addition:
If $\vu$ and $\vv$ are in $S$, then $\vu + \vv$ is in $S$.
3. $S$ is closed under scalar multiplication:
If $\vu$ is in $S$ and $c$ is any scalar, then $c \vu$ is in $S$.
Conditions (2) and (3) together are the same as saying that $S$ is closed under linear combinations.
Example: $\R^n$ is a subspace of $\R^n$. Also, $S = \{ \vec 0 \}$ is a subspace of $\R^n$.
A line or plane through the origin in $\R^3$ is a subspace, as illustrated in the applet.
On the other hand, a plane not through the origin is not a subspace. It of course fails (1), but the other conditions fail as well, as shown in the applet.
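For a concrete algebraic check (using a plane of my own choosing, not necessarily the one in the applet): take the plane $x + y + z = 1$ in $\R^3$. The vectors $\colll 1 0 0$ and $\colll 0 1 0$ lie on it, but
\[ \colll 1 0 0 + \colll 0 1 0 = \colll 1 1 0 \qqtext{and} 2 \colll 1 0 0 = \colll 2 0 0 \]
have coordinates summing to $2$, not $1$, so closure under addition and closure under scalar multiplication both fail.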
Theorem 3.19: Let $\vv_1, \vv_2, \ldots, \vv_k$ be vectors in $\R^n$. Then $\span(\vv_1, \ldots, \vv_k)$ is a subspace of $\R^n$.
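The proof is a direct check of the three conditions. First, $\vec 0 = 0 \vv_1 + \cdots + 0 \vv_k$ is in the span. And if $\vu = c_1 \vv_1 + \cdots + c_k \vv_k$ and $\vw = d_1 \vv_1 + \cdots + d_k \vv_k$, then
\[ \vu + \vw = (c_1 + d_1) \vv_1 + \cdots + (c_k + d_k) \vv_k \qqtext{and} c \vu = (c c_1) \vv_1 + \cdots + (c c_k) \vv_k , \]
which are again linear combinations of $\vv_1, \ldots, \vv_k$.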
Theorem 3.21: Let $A$ be an $m \times n$ matrix and let $N$ be the set of solutions of the homogeneous system $A \vx = \vec 0$. Then $N$ is a subspace of $\R^n$.
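Again the proof is a direct check, this time using linearity of matrix-vector multiplication: if $A \vu = \vec 0$ and $A \vv = \vec 0$, then
\[ A \vec 0 = \vec 0, \qquad A(\vu + \vv) = A\vu + A\vv = \vec 0 + \vec 0 = \vec 0, \qquad A(c \vu) = c (A \vu) = c \vec 0 = \vec 0 . \]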
Spans and null spaces are the two main sources of subspaces.
Definition: Let $A$ be an $m \times n$ matrix.
1. The row space of $A$ is the subspace $\row(A)$ of $\R^n$ spanned
by the rows of $A$.
2. The column space of $A$ is the subspace $\col(A)$ of $\R^m$ spanned
by the columns of $A$.
3. The null space of $A$ is the subspace $\null(A)$ of $\R^n$
consisting of the solutions to the system $A \vx = \vec 0$.
Example: The column space of $A = \bmat{rr} 1 & 2 \\ 3 & 4 \emat$ is $\span(\coll 1 3, \coll 2 4)$. A vector $\vb$ is a linear combination of these columns if and only if the system $A \vx = \vb$ has a solution. But since $A$ is invertible (its determinant is $4 - 6 = -2 \neq 0$), every such system has a (unique) solution. So $\col(A) = \R^2$.
The row space of $A$ is the same as the column space of $A^T$, so by a similar argument, this is all of $\R^2$ as well.
The null space of $A$ consists of the vectors $\coll x y$ such that $A \coll x y = \vec 0$. But since $A$ is invertible (or because the columns are linearly independent), this only has the trivial solution. So $\null(A) = \{ \vec 0 \}$.
Note that these arguments work for any invertible matrix $A$.
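Spelled out, for any invertible $n \times n$ matrix $A$:
\[ A \vx = \vb \iff \vx = A^{-1} \vb , \]
so every system $A \vx = \vb$ has the unique solution $A^{-1} \vb$, giving $\col(A) = \R^n$; taking $\vb = \vec 0$ gives $\null(A) = \{ \vec 0 \}$. The row space is handled the same way using $A^T$, which is also invertible.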
Example: The column space of $A = \bmat{rr} 1 & 2 \\ 3 & 4 \\ 5 & 6 \emat$ is the span of the two columns, which is a subspace of $\R^3$. Since the columns are linearly independent, this is a plane through the origin in $\R^3$.
The row space of $A$ is the span of the three rows. But we already saw that the span of the first two rows is $\R^2$, so the span of all three rows is also $\R^2$. So $\row(A) = \R^2$.
Again, since the columns are linearly independent, $\null(A) = \{ \vec 0 \}$.
Example (cont): For the same $A$, determine whether $\colll 2 0 1$ and $\colll 2 0 {-2}$ are in $\col(A)$.
For the first one, we need to see if the system $A \vx = \colll 2 0 1$ is consistent. So we do row reduction: \[ \bmat{rr|r} 1 & 2 & 2 \\ 3 & 4 & 0 \\ 5 & 6 & 1 \emat \lra{} \bmat{rr|r} 1 & 2 & 2 \\ 0 & -2 & -6 \\ 0 & 0 & 3 \emat \] The last row means that the system is not consistent, so the first vector is not in $\col(A)$.
For the second one: \[ \bmat{rr|r} 1 & 2 & 2 \\ 3 & 4 & 0 \\ 5 & 6 & \red{-2} \emat \lra{} \bmat{rr|r} 1 & 2 & 2 \\ 0 & -2 & -6 \\ 0 & 0 & \red{0} \emat \] Since this is in REF and we don't have "zero = nonzero", it is consistent, so the second vector is in $\col(A)$.
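In fact, back substitution in the last display finds the coefficients: $-2 x_2 = -6$ gives $x_2 = 3$, and then $x_1 + 2(3) = 2$ gives $x_1 = -4$. As a check:
\[ -4 \colll 1 3 5 + 3 \colll 2 4 6 = \colll 2 0 {-2} . \]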
We will learn methods to describe the three subspaces associated to a matrix $A$. But how do we want to "describe" a subspace? That's our next topic:
Definition: A basis for a subspace $S$ of $\R^n$ is a
set of vectors $\vv_1, \ldots, \vv_k$ such that:
1. $S = \span(\vv_1, \ldots, \vv_k)$, and
2. $\vv_1, \ldots, \vv_k$ are linearly independent.
Condition (2) ensures that none of the vectors is redundant, so we aren't being wasteful. Giving a basis for a subspace is a good way to "describe" it.
Example 3.42: The standard unit vectors $\ve_1, \ldots, \ve_n$ in $\R^n$ are linearly independent and span $\R^n$, so they form a basis of $\R^n$ called the standard basis.
Example: We saw above that $\coll 1 3$ and $\coll 2 4$ span $\R^2$. They are also linearly independent, so they are a basis for $\R^2$.
Note that $\coll 1 0$ and $\coll 0 1$ are another basis for $\R^2$. A subspace will in general have many bases, but we'll see soon that they all have the same number of vectors! (Grammar: one basis, two bases.)
Example: Let $\cP$ be the plane through the origin with direction vectors $\colll 1 3 5$ and $\colll 2 4 6$. Then $\cP$ is a subspace of $\R^3$ and these two vectors are a basis for $\cP$.
Example: Find a basis for $S = \span(\colll 3 0 2, \colll {-2} 1 1, \colll 1 1 3)$.
In more complicated situations, there are two ways to find a basis of the span of a set of vectors. The first way uses the following result:
Theorem 3.20: Let $A$ and $B$ be row equivalent matrices. Then $\row(A) = \row(B)$.
(Row equivalent means that $B$ can be obtained from $A$ by doing row operations.)
Proof: Suppose $B$ is obtained from $A$ by performing elementary row operations. Each such operation replaces a row with a linear combination of the existing rows, so every row of $B$ is a linear combination of the rows of $A$. So $\row(B)\ \red{\subseteq}\ \row(A)$.
On the other hand, each row operation is reversible, so reversing the above argument gives that $\row(A) \subseteq \row(B)$. Therefore, $\row(A) = \row(B). \quad \Box$
This will be useful, because it is easy to understand the row space of a matrix in row echelon form.
Example: Let's redo the above example. Consider the matrix $$ A = \bmat{rrr} 3 & 0 & 2 \\ -2 & 1 & 1 \\ 1 & 1 & 3 \emat $$ whose rows are the given vectors. So $S = \row(A)$.
Row reduction produces the following matrix $$ B = \bmat{ccc} 1 & 0 & 2/3 \\ 0 & 1 & 7/3 \\ 0 & 0 & 0 \emat $$ which is in reduced row echelon form. By Theorem 3.20, $S = \row(B)$. But the first two rows clearly give a basis for $\row(B)$, so another solution to the question is $\ccolll 1 0 {2/3}$ and $\ccolll 0 1 {7/3}$.
Theorem: If $R$ is a matrix in row echelon form, then the nonzero rows of $R$ form a basis for $\row(R)$.
Example: Let $$ R = \bmat{rrrr} 1 & 2 & 3 & 4 \\ 0 & 5 & 6 & 7 \\ 0 & 0 & 0 & 8 \\ 0&0&0&0 \emat = \collll {\va_1} {\va_2} {\va_3} {\va_4} $$ $\row(R)$ is the span of the nonzero rows, since zero rows don't contribute. So we just need to see that the nonzero rows are linearly independent. If we had $c_1 \va_1 + c_2 \va_2 + c_3 \va_3 = \vec 0$, then $c_1 = 0$, by looking at the first component. So $5 c_2 = 0$, by looking at the second component. And so $8 c_3 = 0$, by looking at the fourth component. So $c_1 = c_2 = c_3 = 0$.
The same argument works in general, by looking at the pivot (leading) columns, and this proves the Theorem.
This gives rise to the row method for finding a basis for a subspace $S$ spanned by some vectors $\vv_1, \ldots, \vv_k$:
1. Form the matrix $A$ whose rows are $\vv_1, \ldots, \vv_k$, so $S = \row(A)$.
2. Reduce $A$ to row echelon form $R$.
3. The nonzero rows of $R$ will be a basis of $S = \row(A) = \row(R)$.
Notice that the vectors you get are usually different from the vectors you started with. Given $S = \span(\vv_1, \ldots, \vv_k)$, one can always find a basis for $S$ which just omits some of the given vectors. We'll explain this next.
Suppose we form a matrix $A$ whose columns are $\vv_1, \ldots, \vv_k$. A nonzero solution to the system $A \vx = \vec 0$ is exactly a dependency relationship between the given vectors. Also, recall that if $R$ is row equivalent to $A$, then $R \vx = \vec 0$ has the same solutions as $A \vx = \vec 0$. This means that the columns of $R$ have the same dependency relationships as the columns of $A$.
Example 3.47: Find a basis for the column space of $$ A = \bmat{rrrrr} 1 & 1 & 3 & 1 & 6 \\ 2 & -1 & 0 & 1 & -1 \\ -3 & 2 & 1 & -2 & 1 \\ 4 & 1 & 6 & 1 & 3 \emat $$ Solution: The reduced row echelon form is $$ R = \bmat{rrrrr} 1 & 0 & 1 & 0 & -1 \\ 0 & 1 & 2 & 0 & 3 \\ 0 & 0 & 0 & 1 & 4 \\ 0 & 0 & 0 & 0 & 0 \emat $$ Write $\vr_i$ for the columns of $R$ and $\va_i$ for the columns of $A$. You can see immediately that $\vr_3 = \vr_1 + 2 \vr_2$ and $\vr_5 = -\vr_1 + 3\vr_2 + 4\vr_4$. So $\col(R) = \span(\vr_1, \vr_2, \vr_4)$, and these three are linearly independent since they are standard unit vectors.
It follows that the columns of $A$ have the same dependency relationships: $\va_3 = \va_1 + 2 \va_2$ and $\va_5 = -\va_1 + 3\va_2 + 4\va_4$. Also, $\va_1$, $\va_2$ and $\va_4$ must be linearly independent. So a basis for $\col(A)$ is given by $\va_1$, $\va_2$ and $\va_4$.
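As a sanity check of the first relation, using the actual columns of $A$:
\[ \va_1 + 2 \va_2 = \collll 1 2 {-3} 4 + 2 \collll 1 {-1} 2 1 = \collll 3 0 1 6 = \va_3 . \]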
Note that these are the columns corresponding to the leading 1's in $R$!
Warning: Elementary row operations change the column space! So $\col(A) \neq \col(R)$. So while $\vr_1$, $\vr_2$ and $\vr_4$ are a basis for $\col(R)$, they are not a solution to the question asked.
The other kind of subspace that arises a lot is the null space of a matrix $A$, the subspace of solutions to the homogeneous system $A \vx = \vec 0$. We learned in Chapter 2 how to find a basis for this subspace, even though we didn't use this terminology.
Example 3.48: Find a basis for the null space of the $4 \times 5$ matrix $A$ above.
Solution: The reduced row echelon form of $[A \mid \vec 0\,]$ is $$ [R \mid \vec 0\,] = \bmat{rrrrr|r} 1 & 0 & 1 & 0 & -1 & 0 \\ 0 & 1 & 2 & 0 & 3 & 0 \\ 0 & 0 & 0 & 1 & 4 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 \emat $$ We see that $x_3$ and $x_5$ are free variables, so we let $x_3 = s$ and $x_5 = t$ and use back substitution to find that $$ \vx = \colllll {x_1} {x_2} {x_3} {x_4} {x_5} = s \colllll {-1} {-2} 1 0 0 + t \colllll 1 {-3} 0 {-4} 1 \qqtext{(See text.)} $$ Therefore, the two column vectors shown form a basis for the null space.
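It's worth checking one of these against $A$. The second basis vector says $\va_1 - 3 \va_2 - 4 \va_4 + \va_5 = \vec 0$, and indeed
\[ \collll 1 2 {-3} 4 - 3 \collll 1 {-1} 2 1 - 4 \collll 1 1 {-2} 1 + \collll 6 {-1} 1 3 = \collll 0 0 0 0 , \]
which is exactly the dependency relation $\va_5 = -\va_1 + 3 \va_2 + 4 \va_4$ found in Example 3.47.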
The vectors that arise in this way will always be linearly independent, since if all $x_i$'s are $0$, then the free variables must be zero, so the parameters must be zero.
To summarize, here is how to find bases for the three subspaces associated to a matrix $A$:
1. Find the reduced row echelon form $R$ of $A$.
2. The nonzero rows of $R$ form a basis for $\row(A) = \row(R)$.
3. The columns of $A$ that correspond to the columns of $R$ with leading 1's
form a basis for $\col(A)$.
4. Use back substitution to solve $R \vx = \vec 0$; the vectors that
arise are a basis for $\null(A) = \null(R)$.
You just need to do row reduction once to answer all three questions!
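For instance, here is a quick sketch of all three steps applied to the matrix $A = \bmat{rrr} 3 & 0 & 2 \\ -2 & 1 & 1 \\ 1 & 1 & 3 \emat$ from the earlier example, using its reduced row echelon form $B$ computed above:
1. A basis for $\row(A)$ is given by the nonzero rows $\ccolll 1 0 {2/3}$ and $\ccolll 0 1 {7/3}$.
2. The leading 1's are in columns 1 and 2, so a basis for $\col(A)$ is $\colll 3 {-2} 1$ and $\colll 0 1 1$.
3. Setting $x_3 = t$, back substitution gives $x_1 = -\frac{2}{3} t$ and $x_2 = -\frac{7}{3} t$, so a basis for $\null(A)$ is $\ccolll {-2/3} {-7/3} 1$.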
We have seen two ways to compute a basis of a span of a set of vectors. One is to make them the columns of a matrix, and the other is to make them the rows. The column method produces a basis using vectors from the original set. Both ways require about the same amount of work.
Similarly, if asked to find a basis for $\row(A)$, one could use the column method on $A^T$.
The proof that any two bases of a subspace have the same number of vectors is in the next lecture. That fact is what makes the following definition sensible:
Definition: The number of vectors in a basis for a subspace $S$ is called the dimension of $S$, denoted $\dim S$.
We say that the dimension of the zero subspace $\{ \vec 0 \}$ is $0$.
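For example, counting the vectors in the bases found above:
\[ \dim \R^n = n, \qquad \dim \cP = 2, \qquad \dim \row(A) = \dim \col(A) = 3 \qqtext{and} \dim \null(A) = 2 \]
for the $4 \times 5$ matrix $A$ of Examples 3.47 and 3.48.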