Read Section 2.4 for next class, but just the parts on network flow and electrical networks. We aren't covering the rest. Work through recommended homework questions.
Quiz 3 is this week, and will cover the material until the end of Section 2.3.
Office hour: today, 3:00-3:30, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.
Definition: A vector $\vv$ is a linear combination of vectors $\vv_1, \vv_2, \ldots, \vv_k$ if there exist scalars $c_1, c_2, \ldots, c_k$ (called coefficients) such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vv . $$
Example: Is $\colll 4 8 6$ a linear combination of $\colll 4 5 6$ and $\colll 2 1 3$?
That is, can we find scalars $x$ and $y$ such that $$x \colll 4 5 6 + y \colll 2 1 3 = \colll 4 8 6 ?$$
Expanding this into components, this becomes a linear system $$ \kern-7ex \begin{aligned} 4 x + 2 y\ &= 4 \\ 5 x + \ph y\ &= 8 \\ 6 x + 3 y\ &= 6 \end{aligned} \quad\text{with augmented matrix}\quad \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat $$ and we already know how to determine whether this system is consistent: use row reduction!
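Carrying this out: $$ \kern-7ex \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat \quad\text{row reduces to}\quad \bmat{rr|r} 1 & 0 & 2 \\ 0 & 1 & -2 \\ 0 & 0 & 0 \emat $$ So the system is consistent, with solution $x = 2$ and $y = -2$, and the answer is yes: $$ 2 \colll 4 5 6 - 2 \colll 2 1 3 = \colll 4 8 6 . $$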
Theorem 2.4: A system with augmented matrix $[A \mid \vb \,]$ is consistent if and only if $\vb$ is a linear combination of the columns of $A$.
This gives a different, geometric way to understand the solutions of a system.
Definition: The span of a set $S = \{ \vv_1, \ldots, \vv_k \}$ of vectors, written $\span(S)$ or $\span(\vv_1, \ldots, \vv_k)$, is the set of all linear combinations of $\vv_1, \ldots, \vv_k$. If $\span(S) = \R^n$, then $S$ is called a spanning set for $\R^n$.
Example: $\span(\ve_1, \ve_2, \ldots, \ve_n) = \R^n$.
Example: The span of $\vu = \colll 1 2 3$ and $\vv = \colll 4 5 6$ consists of every vector $\vx$ that can be written as $$ \vx = s \vu + t \vv $$ for some scalars $s$ and $t$. Since $\vu$ and $\vv$ are not parallel, this is the plane through the origin in $\R^3$ with direction vectors $\vu$ and $\vv$.
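In fact, Theorem 2.4 gives an equation for this plane. The augmented matrix $$ \kern-7ex \bmat{rr|r} 1 & 4 & b_1 \\ 2 & 5 & b_2 \\ 3 & 6 & b_3 \emat \quad\text{row reduces to}\quad \bmat{rr|r} 1 & 4 & b_1 \\ 0 & -3 & b_2 - 2 b_1 \\ 0 & 0 & b_1 - 2 b_2 + b_3 \emat $$ so a vector $\vb = \colll {b_1} {b_2} {b_3}$ is in the span exactly when $b_1 - 2 b_2 + b_3 = 0$: this is the equation of the plane.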
Example: The line $\{ \colll x {2x} {3x} \mid x \in \R \}$ is spanned by $\colll 1 2 3$.
Example: The line $\{ \coll x y \mid y = x/2 + 1 \}$ is not the span of any set of vectors, since every span contains the zero vector, but this line does not pass through the origin.
Example: We saw that $\span(\coll 1 2)$ and $\span(\coll 1 2, \coll 2 4)$ are both equal to the line through the origin with direction vector $\coll 1 2$, since $\coll 2 4 = 2 \coll 1 2$.
Definition: A set of vectors $\vv_1, \ldots, \vv_k$ is linearly dependent if there are scalars $c_1, \ldots, c_k$, at least one of which is nonzero, such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 . $$ If $c_i \neq 0$, then we can solve this equation for $\vv_i$: $$ \vv_i = -\tfrac{c_1}{c_i} \vv_1 - \cdots - \tfrac{c_k}{c_i} \vv_k \quad\text{(omitting the $i$th term),} $$ so the corresponding vector can be expressed as a linear combination of the others.
Example: $ \coll {-2} 4 - 2 \coll {-1} 2 + 0 \coll 5 6 = \coll 0 0 $, so the vectors $\coll {-2} 4$, $\coll {-1} 2$ and $\coll 5 6$ are linearly dependent.
Note that each of the first two vectors is a scalar multiple of the other, and so is a linear combination of the others; but the third is not a linear combination of the first two.
Example: Are the vectors $\ve_1 = \coll 1 0$ and $\ve_2 = \coll 0 1$ linearly dependent?
Solution: If $c \, \ve_1 + d \, \ve_2 = \vec 0$ and $c \neq 0$, then $\ve_1 = -\frac d c \ve_2$, which is not possible, since every scalar multiple of $\ve_2$ has first component $0$. Similarly, if $d \neq 0$, then $\ve_2$ is a multiple of $\ve_1$, which is also impossible. So the only way to have $c \, \ve_1 + d \, \ve_2 = \vec 0$ is with $c = d = 0$, and the vectors are not linearly dependent.
Theorem 2.5: The vectors $\vv_1, \ldots, \vv_k$ are linearly dependent if and only if at least one of them can be expressed as a linear combination of the others.
Proof: We've seen one direction. For the other, if $\vv_k = c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1}$, then $c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1} - \vv_k = \vec 0$, so the vectors are linearly dependent. The same argument works if it is a different vector that can be expressed in terms of the others.
Example: Are the vectors $\ve_1$, $\ve_2$ and $\coll 0 0$ linearly dependent?
Solution: They are linearly dependent, since $$ 0 \coll 1 0 + 0 \coll 0 1 + 1 \coll 0 0 = \coll 0 0 . $$ Fact: Any set of vectors containing the zero vector is linearly dependent.
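The same reasoning proves the general fact: if $\vv_i = \vec 0$, then $$ 0 \, \vv_1 + \cdots + 1 \, \vv_i + \cdots + 0 \, \vv_k = \vec 0 , $$ and the coefficient of $\vv_i$ is nonzero.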
Definition: A set of vectors $\vv_1, \ldots, \vv_k$ is linearly independent if it is not linearly dependent.
Another way to say this is that the system $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 $$ has only the trivial solution $c_1 = \cdots = c_k = 0$. (The book calls this Theorem 2.6.)
This is something we know how to figure out! Use row reduction!
Example: Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$ and $\vw = \colll 6 {-4} {-2}$ linearly independent?
That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {-2} = \vec 0 $$ have only the trivial solution?
The augmented matrix is $$ \kern-7ex \bmat{rrr|r} \!-1 & 2 & 6 & 0 \\ 3 & 1 & -4 & 0 \\ 2 & 1 & -2 & 0 \emat \quad \text{which row reduces to} \quad \bmat{rrr|r} \!-1 & 2 & 6 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 0 & 0 \emat $$ So what's the answer? There are 3 variables and 2 leading variables (the rank is 2), so there is one free variable, which means there are non-trivial solutions. Therefore, the vectors are linearly dependent.
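In fact, setting the free variable $c_3 = 1$ and back-substituting gives $c_2 = -2$ and $c_1 = 2$, so $$ 2 \vu - 2 \vv + \vw = \vec 0 , $$ which shows the dependence explicitly. (We'll find this same relation again with the row method below.)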
Example: Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$ and $\vw = \colll 6 {-4} {\red{3}}$ linearly independent?
That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {\red{3}} = \vec 0 $$ have only the trivial solution?
The augmented matrix is $$ \kern-7ex \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 3 & 1 & -4 & 0 \\ 2 & 1 & \red{3} & 0 \emat \quad \text{which row reduces to} \quad \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & \red{1} & 0 \emat $$ So what's the answer? There are 3 variables and 3 leading variables (the rank is 3), so there are no free variables, which means there is only the trivial solution. Therefore, the vectors are linearly independent.
Example 2.24: Are the standard unit vectors $\ve_1, \ldots, \ve_n$ in $\R^n$ linearly independent?
Solution: The augmented matrix is $$ \bmat{ccccc|c} 1 & 0 & 0 & \cdots & 0 & 0 \\ 0 & 1 & 0 & \cdots & 0 & 0 \\ 0 & 0 & 1 & \cdots & 0 & 0 \\ \vdots & & & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & \cdots & 1 & 0 \emat $$ with $n$ rows and $n$ variables. The rank is $n$, so there is only the trivial solution. So the standard unit vectors are linearly independent.
Note: You can sometimes see by inspection that some vectors are linearly dependent, e.g. if one of them is the zero vector, or if one is a scalar multiple of another. Here's one other situation:
Theorem 2.8: If $m > n$, then any set of $m$ vectors in $\R^n$ is linearly dependent.
Proof: The dependence test $c_1 \vv_1 + \cdots + c_m \vv_m = \vec 0$ is a homogeneous system with $m$ variables and $n$ equations. By Theorem 2.3, a homogeneous system with more variables than equations always has a non-trivial solution, so the vectors are linearly dependent.$\quad\Box$
Example: The vectors $\coll 1 2, \coll 3 4, \coll 5 6$ must be linearly dependent. No work required, unless you want to know how they are dependent.
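(If you do: row reducing as in the earlier examples gives, for instance, $\coll 1 2 - 2 \coll 3 4 + \coll 5 6 = \coll 0 0$.)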
Above we put vectors into the columns of a matrix in order to determine whether they are linearly dependent. There is an alternate approach, putting the vectors into the rows.
Example like 2.25: Consider the same three vectors we used earlier, this time written as row vectors: $\vu = [-1, 3, 2]$, $\vv = [2, 1, 1]$ and $\vw = [6,-4,-2]$. Let's row reduce the matrix that has these vectors as rows, giving new names to the new rows: $$ \kern-8ex \begin{aligned} \bmat{rrr} -1 & 3 & 2 \\ 2 & 1 & 1 \\ 6 & -4 & -2 \emat \lra{\mystack{R_2' = R_2 + 2R_1}{R_3' = R_3 + 6 R_1}} &\bmat{rrr} -1 & 3 & 2 \\ 0 & 7 & 5 \\ 0 & 14 & 10 \emat \\ \phantom{2}\qquad\lra{\displaystyle R_3'' = R_3' - 2 R_2'} &\bmat{rrr} -1 & 3 & 2 \\ 0 & 7 & 5 \\ 0 & \ph 0 & \ph 0 \emat \\ \end{aligned} $$ We got a zero row at the end, so we find that $$ \kern-6ex \begin{aligned} \vec 0 = R_3''\ &= R_3' - 2 R_2' \\ &= (R_3 + 6 R_1) - 2(R_2 + 2 R_1) \\ &= 2 R_1 - 2 R_2 + R_3 = 2 \vu - 2 \vv + \vw \end{aligned} $$ which shows that the original row vectors are linearly dependent. This works in general:
Theorem 2.7: Let $\vv_1, \vv_2, \ldots, \vv_m$ be row vectors in $\R^n$, and let $A$ be the $m \times n$ matrix whose rows are these vectors. Then $\vv_1, \vv_2, \ldots, \vv_m$ are linearly dependent if and only if $\rank(A) < m$.
Proof: Suppose that the rank of $A$ is less than $m$. Then some sequence of row operations will produce a zero row in $A$. As in the example above, this means that you can write the zero vector as a linear combination of the original rows. One can show that the coefficients won't all be zero, so it follows that the vectors are linearly dependent.
On the other hand, if the vectors are linearly dependent, then one of them can be written as a linear combination of the others. For example, suppose $\vv_m = c_1 \vv_1 + c_2 \vv_2 + \cdots + c_{m-1} \vv_{m-1}$. Then if you do the row operations $R_m - c_1 R_1$, $\ldots$, $R_m - c_{m-1} R_{m-1}$, you will produce a zero row. So the rank of $A$ must be less than $m$. The same idea works if a different vector is a linear combination of the others.$\quad\Box$
Normally the column method (Theorem 2.6) is used to check for linear dependence, but for certain questions, the row method (Theorem 2.7) is important.
Question: Do the vectors $\vu = \colll 1 1 2$ and $\vv = \colll 2 1 3$ span $\R^3$? If not, find a vector not in their span.
Question: Are the same two vectors linearly independent?
Question: Suppose that the rows of an $m \times n$ matrix $A$ are linearly independent. What can you say about the rank of $A$?
Question: Suppose that the columns of an $m \times n$ matrix $A$ are linearly independent. What can you say about the rank of $A$?
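Brief answers, for checking your work: To test whether $\vb = \colll {b_1} {b_2} {b_3}$ is in the span of $\vu$ and $\vv$, row reduce: $$ \kern-7ex \bmat{rr|r} 1 & 2 & b_1 \\ 1 & 1 & b_2 \\ 2 & 3 & b_3 \emat \quad\text{which row reduces to}\quad \bmat{rr|r} 1 & 2 & b_1 \\ 0 & -1 & b_2 - b_1 \\ 0 & 0 & b_3 - b_1 - b_2 \emat $$ So the span consists only of the vectors with $b_3 = b_1 + b_2$, and $\vu$ and $\vv$ do not span $\R^3$; for example, $\colll 0 0 1$ is not in their span. The two vectors are linearly independent, since neither is a scalar multiple of the other. If the rows of an $m \times n$ matrix $A$ are linearly independent, then $\rank(A) = m$ by Theorem 2.7 (and so $m \le n$ by Theorem 2.8). If the columns are linearly independent, then the homogeneous system with coefficient matrix $A$ has only the trivial solution, so $\rank(A) = n$ (and $n \le m$).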
The intuition behind linear dependence:
Linear dependence captures the idea that there is redundancy in the set of vectors: a smaller set will have the same span. Put another way, the vectors will span something smaller than you expect:
Typically, two vectors will span a plane; but if one is a multiple of the other one, then they will only span a line.
Typically, three vectors in $\R^3$ will span all of $\R^3$; but if one is a linear combination of the others, then they will span a plane (or something smaller).
Typically, one vector spans a line. But if it is the zero vector, its span consists of only the origin.