Math 1600 Lecture 11, Section 2, 29 Sep 2014

$ \newcommand{\bdmat}[1]{\left|\begin{array}{#1}} \newcommand{\edmat}{\end{array}\right|} \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\ccoll}[2]{\bmat{c} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\ccolll}[3]{\bmat{c} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\ccollll}[4]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\colllll}[5]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\ccolllll}[5]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\null}{\textrm{null}} \newcommand{\nullity}{\textrm{nullity}} \renewcommand{\Re}{\operatorname{Re}} \renewcommand{\Im}{\operatorname{Im}} \renewcommand{\Arg}{\operatorname{Arg}} \renewcommand{\arg}{\operatorname{arg}} \newcommand{\adj}{\textrm{adj}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} \newcommand{\qimplies}{\quad\implies\quad} \newcommand{\qtext}[1]{\quad\text{#1}\quad} \newcommand{\qqtext}[1]{\qquad\text{#1}\qquad} \newcommand{\svec}[1]{\,\vec{#1}} \newcommand{\querytext}[1]{\toggle{\text{?}\vphantom{\text{#1}}}{\text{#1}}\endtoggle} \newcommand{\query}[1]{\toggle{\text{?}\vphantom{#1}}{#1}\endtoggle} \newcommand{\smallquery}[1]{\toggle{\text{?}}{#1}\endtoggle} \newcommand{\bv}{\mathbf{v}} %\require{AMScd} $

Announcements:

Read Section 2.4 for next class, but just the parts on network flow and electrical networks. We aren't covering the rest. Work through recommended homework questions.

Quiz 3 is this week, and will cover the material until the end of Section 2.3.

Office hour: today, 3:00-3:30, MC103B.

Help Centers: Monday-Friday 2:30-6:30 in MC 106.

Partial review of Lecture 10:

Linear combinations

Definition: A vector $\vv$ is a linear combination of vectors $\vv_1, \vv_2, \ldots, \vv_k$ if there exist scalars $c_1, c_2, \ldots, c_k$ (called coefficients) such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vv . $$

Example: Is $\colll 4 8 6$ a linear combination of $\colll 4 5 6$ and $\colll 2 1 3$?

That is, can we find scalars $x$ and $y$ such that $$x \colll 4 5 6 + y \colll 2 1 3 = \colll 4 8 6 ?$$

Expanding this into components, this becomes a linear system $$ \kern-7ex \begin{aligned} 4 x + 2 y\ &= 4 \\ 5 x + \ph y\ &= 8 \\ 6 x + 3 y\ &= 6 \end{aligned} \quad\text{with augmented matrix}\quad \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat $$ and we already know how to determine whether this system is consistent: use row reduction!
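
For example, one possible sequence of row operations is $$ \kern-7ex \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat \lra{\mystack{R_2 - \frac{5}{4} R_1}{R_3 - \frac{3}{2} R_1}} \bmat{rr|r} 4 & 2 & 4 \\ 0 & -\frac{3}{2} & 3 \\ 0 & 0 & 0 \emat $$ There is no leading entry in the last column, so the system is consistent: back substitution gives $y = -2$ and then $x = 2$, and indeed $$ 2 \colll 4 5 6 - 2 \colll 2 1 3 = \colll 4 8 6 . $$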

Theorem 2.4: A system with augmented matrix $[A \mid \vb \,]$ is consistent if and only if $\vb$ is a linear combination of the columns of $A$.

This gives a different geometrical way to understand the solutions to a system.

 

Spanning Sets of Vectors

Definition: If $S = \{ \vv_1, \ldots, \vv_k \}$ is a set of vectors in $\R^n$, then the set of all linear combinations of $\vv_1, \ldots, \vv_k$ is called the span of $\vv_1, \ldots, \vv_k$ and is denoted $\span(\vv_1, \ldots, \vv_k)$ or $\span(S)$.

If $\span(S) = \R^n$, then $S$ is called a spanning set for $\R^n$.

Example: $\span(\ve_1, \ve_2, \ldots, \ve_n) = \R^n$.
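
To see this, note that any $\vx$ in $\R^n$ is the linear combination of the standard unit vectors whose coefficients are its components: $$ \vx = \ccolll{x_1}{\vdots}{x_n} = x_1 \ve_1 + x_2 \ve_2 + \cdots + x_n \ve_n . $$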

Example: The span of $\vu = \colll 1 2 3$ and $\vv = \colll 4 5 6$ consists of every vector $\vx$ that can be written as $$ \vx = s \vu + t \vv $$ for some scalars $s$ and $t$. Since $\vu$ and $\vv$ are not parallel, this is the plane through the origin in $\R^3$ with direction vectors $\vu$ and $\vv$.
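
In fact, one can check that this is the plane with general equation $x - 2y + z = 0$: both $\vu$ and $\vv$ satisfy this equation, since $1 - 4 + 3 = 0$ and $4 - 10 + 6 = 0$, and therefore so does every linear combination $s \vu + t \vv$.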

Example: The line $\{ \colll x {2x} {3x} \mid x \in \R \}$ is spanned by $\colll 1 2 3$.

Question: What vector is always in $\span(\vv_1, \vv_2, \ldots, \vv_k)^\strut$?

Example: The line $\{ \coll x y \mid y = x/2 + 1 \}$ is not the span of any set of vectors: every span contains the zero vector, but this line does not pass through the origin.

Example: We saw that $\span(\coll 1 2)$ and $\span(\coll 1 2, \coll 2 4)$ are both equal to the line through the origin with direction vector $\coll 1 2$, since $\coll 2 4 = 2 \coll 1 2$.

New material: Section 2.3: Spanning Sets and Linear Independence

Linear Dependence and Independence

Suppose that we have vectors $\vu$, $\vv$ and $\vw$ in $\R^n$ such that $\vu + 3 \vv - 2 \vw = \vec 0$. This can be solved for any of the vectors in terms of the others, e.g. $\vu = -3\vv + 2\vw$. This means that $\span(\vu, \vv, \vw) = \span(\vv, \vw)$. For example, $$ \kern-6ex \begin{aligned} a \vu + b \vv + c \vw\ &= a (-3\vv+2\vw) + b \vv + c \vw \\ &= (-3a+b) \vv + (2a+c)\vw \end{aligned} $$ So it is redundant to include $\vu$. We'd like to be able to determine when our spanning sets have too many vectors.

Definition: A set of vectors $\vv_1, \ldots, \vv_k$ is linearly dependent if there are scalars $c_1, \ldots, c_k$, at least one of which is nonzero, such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 . $$ Since at least one of the scalars is non-zero, the corresponding vector can be expressed as a linear combination of the others.
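
For example, if $c_i \neq 0$, then we can divide by $c_i$ and solve: $$ \vv_i = -\frac{c_1}{c_i} \vv_1 - \cdots - \frac{c_{i-1}}{c_i} \vv_{i-1} - \frac{c_{i+1}}{c_i} \vv_{i+1} - \cdots - \frac{c_k}{c_i} \vv_k . $$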

Example: $ \coll {-2} 4 - 2 \coll {-1} 2 + 0 \coll 5 6 = \coll 0 0 $, so the vectors $\coll {-2} 4$, $\coll {-1} 2$ and $\coll 5 6$ are linearly dependent.

Note that either of the first two can be expressed as a linear combination of the other, but the third one is not a linear combination of the first two.

Example: Are the vectors $\ve_1 = \coll 1 0$ and $\ve_2 = \coll 0 1$ linearly dependent?

Solution: No. If $c_1 \ve_1 + c_2 \ve_2 = \vec 0$, then $\coll {c_1} {c_2} = \coll 0 0$, so $c_1 = c_2 = 0$: there is no relation with a nonzero coefficient, so the vectors are not linearly dependent.

Theorem 2.5: The vectors $\vv_1, \ldots, \vv_k$ are linearly dependent if and only if at least one of them can be expressed as a linear combination of the others.

Proof: We've seen one direction. For the other, if $\vv_k = c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1}$, then $c_1 \vv_1 + \cdots + c_{k-1} \vv_{k-1} - \vv_k = \vec 0$, so the vectors are linearly dependent. The same argument works if it is a different vector that can be expressed in terms of the others.

Example: Are the vectors $\ve_1$, $\ve_2$ and $\coll 0 0$ linearly dependent?

Solution: Yes: $0 \, \ve_1 + 0 \, \ve_2 + 1 \coll 0 0 = \coll 0 0$, and the third coefficient is nonzero. In fact, any set of vectors that includes the zero vector is linearly dependent.

Definition: A set of vectors $\vv_1, \ldots, \vv_k$ is linearly independent if it is not linearly dependent.

Another way to say this is that the system $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vec 0 $$ has only the trivial solution $c_1 = \cdots = c_k = 0$. (The book calls this Theorem 2.6.)

This is something we know how to figure out! Use row reduction!

Example: Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$ and $\vw = \colll 6 {-4} {-2}$ linearly independent?

That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {-2} = \vec 0 $$ have only the trivial solution?

The augmented matrix is $$ \kern-7ex \bmat{rrr|r} \!-1 & 2 & 6 & 0 \\ 3 & 1 & -4 & 0 \\ 2 & 1 & -2 & 0 \emat \quad \text{which row reduces to} \quad \bmat{rrr|r} \!-1 & 2 & 6 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 0 & 0 \emat $$ So what's the answer?
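
Answer: $c_3$ is a free variable, so there are non-trivial solutions and the vectors are linearly dependent. Back substitution gives $(c_1, c_2, c_3) = t \, (2, -2, 1)$, so, for example, $$ 2 \vu - 2 \vv + \vw = \vec 0 . $$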

Example: Are the vectors $\vu = \colll {-1} 3 2$, $\vv = \colll 2 1 1$ and $\vw = \colll 6 {-4} {\red{3}}$ linearly independent?

That is, does the system $$ c_1 \colll {-1} 3 2 + c_2 \colll 2 1 1 + c_3 \colll 6 {-4} {\red{3}} = \vec 0 $$ have only the trivial solution?

The augmented matrix is $$ \kern-7ex \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 3 & 1 & -4 & 0 \\ 2 & 1 & \red{3} & 0 \emat \quad \text{which row reduces to} \quad \bmat{rrr|r} -1 & 2 & 6 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & \red{1} & 0 \emat $$ So what's the answer?
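
Answer: this time there is a leading entry in every column, so back substitution gives $c_3 = 0$, then $c_2 = 0$ and $c_1 = 0$. Only the trivial solution exists, so the vectors are linearly independent.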

Example 2.24: Are the standard unit vectors $\ve_1, \ldots, \ve_n$ in $\R^n$ linearly independent?

Solution: Yes: if $c_1 \ve_1 + \cdots + c_n \ve_n = \vec 0$, then the left-hand side is the vector $\ccolll{c_1}{\vdots}{c_n}$, so every $c_i = 0$. Only the trivial solution exists, so the vectors are linearly independent.

Note: You can sometimes see by inspection that some vectors are linearly dependent, e.g. if one of them is the zero vector, or if one is a scalar multiple of another. Here's one other situation:

Theorem 2.8: If $m > n$, then any set of $m$ vectors in $\R^n$ is linearly dependent.

Proof: The equation $c_1 \vv_1 + \cdots + c_m \vv_m = \vec 0$ is a homogeneous system with $m$ variables and $n$ equations. By Theorem 2.3, a homogeneous system with more variables than equations always has a non-trivial solution, so the vectors are linearly dependent.$\quad\Box$

Example: The vectors $\coll 1 2, \coll 3 4, \coll 5 6$ must be linearly dependent. No work required, unless you want to know how they are dependent.
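
If you do want to know: solving $c_1 \coll 1 2 + c_2 \coll 3 4 + c_3 \coll 5 6 = \vec 0$ by row reduction gives, for example, the dependence relation $$ \coll 1 2 - 2 \coll 3 4 + \coll 5 6 = \coll 0 0 . $$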


Above we put vectors into the columns of a matrix in order to determine whether they are linearly dependent. There is an alternate approach, putting the vectors into the rows.

Example like 2.25: Consider the same three vectors we used earlier, this time written as row vectors: $\vu = [-1, 3, 2]$, $\vv = [2, 1, 1]$ and $\vw = [6,-4,-2]$. Let's row reduce the matrix that has these vectors as rows, giving new names to the new rows: $$ \kern-8ex \begin{aligned} \bmat{rrr} -1 & 3 & 2 \\ 2 & 1 & 1 \\ 6 & -4 & -2 \emat \lra{\mystack{R_2' = R_2 + 2R_1}{R_3' = R_3 + 6 R_1}} &\bmat{rrr} -1 & 3 & 2 \\ 0 & 7 & 5 \\ 0 & 14 & 10 \emat \\ \phantom{2}\qquad\lra{\displaystyle R_3'' = R_3' - 2 R_2'} &\bmat{rrr} -1 & 3 & 2 \\ 0 & 7 & 5 \\ 0 & \ph 0 & \ph 0 \emat \\ \end{aligned} $$ We got a zero row at the end, so we find that $$ \kern-6ex \begin{aligned} \vec 0 = R_3''\ &= R_3' - 2 R_2' \\ &= (R_3 + 6 R_1) - 2(R_2 + 2 R_1) \\ &= 2 R_1 - 2 R_2 + R_3 = 2 \vu - 2 \vv + \vw \end{aligned} $$ which shows that the original row vectors are linearly dependent. This works in general:

Theorem 2.7: Let $\vv_1, \vv_2, \ldots, \vv_m$ be row vectors in $\R^n$, and let $A$ be the $m \times n$ matrix whose rows are these vectors. Then $\vv_1, \vv_2, \ldots, \vv_m$ are linearly dependent if and only if $\rank(A) < m$.

Proof: Suppose that the rank of $A$ is less than $m$. Then some sequence of row operations will produce a zero row in $A$. As in the example above, this means that you can write the zero vector as a linear combination of the original rows. One can show that the coefficients won't all be zero, so it follows that the vectors are linearly dependent.

On the other hand, if the vectors are linearly dependent, then one of them can be written as a linear combination of the others. For example, suppose $\vv_m = c_1 \vv_1 + c_2 \vv_2 + \cdots + c_{m-1} \vv_{m-1}$. Then if you do the row operations $R_m - c_1 R_1$, $\ldots$, $R_m - c_{m-1} R_{m-1}$, you will produce a zero row. So the rank of $A$ must be less than $m$. The same idea works if a different vector is a linear combination of the others.$\quad\Box$

Normally the method of putting the vectors in the columns (Theorem 2.6) is used, but for certain questions, the row method is important.


Question: Do the vectors $\vu = \colll 1 1 2$ and $\vv = \colll 2 1 3$ span $\R^3$? If not, find a vector not in their span.

Question: Are the same two vectors linearly independent?

Question: Suppose that the rows of an $m \times n$ matrix $A$ are linearly independent. What can you say about the rank of $A$?

Question: Suppose that the columns of an $m \times n$ matrix $A$ are linearly independent. What can you say about the rank of $A$?


The intuition behind linear dependence:

Linear dependence captures the idea that there is redundancy in the set of vectors: a smaller set will have the same span. Put another way, the vectors will span something smaller than you expect:

Typically, two vectors will span a plane; but if one is a multiple of the other, then they will only span a line.

Typically, three vectors in $\R^3$ will span all of $\R^3$; but if one is a linear combination of the others, then they will span a plane (or something smaller).

Typically, one vector spans a line. But if it is the zero vector, its span consists of only the origin.