Math 1600 Lecture 10, Section 002, 27 Sep 2024

$ \newcommand{\bdmat}[1]{\left|\begin{array}{#1}} \newcommand{\edmat}{\end{array}\right|} \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\ccoll}[2]{\bmat{c} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\ccolll}[3]{\bmat{c} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\ccollll}[4]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\colllll}[5]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\ccolllll}[5]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\blue}[1]{{\color{blue}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\null}{\textrm{null}} \newcommand{\nullity}{\textrm{nullity}} \renewcommand{\Re}{\operatorname{Re}} \renewcommand{\Im}{\operatorname{Im}} \renewcommand{\Arg}{\operatorname{Arg}} \renewcommand{\arg}{\operatorname{arg}} \newcommand{\adj}{\textrm{adj}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} \newcommand{\qimplies}{\quad\implies\quad} \newcommand{\qtext}[1]{\quad\text{#1}\quad} \newcommand{\qqtext}[1]{\qquad\text{#1}\qquad} \newcommand{\smalltext}[1]{{\small\text{#1}}} \newcommand{\svec}[1]{\,\vec{#1}} \newcommand{\querytext}[1]{\toggle{\blue{\text{?}}\vphantom{\text{#1}}}{\text{#1}}\endtoggle} \newcommand{\query}[1]{\toggle{\blue{\text{?}}\vphantom{#1}}{#1}\endtoggle} \newcommand{\smallquery}[1]{\toggle{\blue{\text{?}}}{#1}\endtoggle} \newcommand{\bv}{\mathbf{v}} \newcommand{\cyc}[2]{\cssId{#1}{\style{visibility:hidden}{#2}}} $

Announcements:

Today we start Section 2.3. Continue reading Section 2.3 for next class; it has lots of new concepts, so read the text and work through the suggested exercises.

Homework 3 is due today at 11:55pm on WeBWorK.

Homework 4 will be on Gradescope. I'll give a quick demo on Wednesday.

No classes on Monday.

Math Help Centre: M-F 12:30-5:30 in PAB48/49 and online 6pm-8pm.

My next office hour is today 2:30-3:20 in MC130. Also on Monday, 3:30-4:00 in MC130.

Review of Section 2.2, Lectures 8 and 9:

Associated to a system of linear equations is an augmented matrix $[A \mid \vb\, ]$. We call $A$ the coefficient matrix.

Performing the following elementary row operations on the augmented matrix doesn't change the solution set:

  1. Exchange two rows.
  2. Multiply a row by a nonzero constant.
  3. Add a multiple of one row to another.
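
For instance (a small illustration added here, writing $R_i$ for row $i$), all three operations appear in the following sequence, and the solution ($x = -1$, $y = 2$) is unchanged at every stage: $$ \bmat{rr|r} 0 & 1 & 2 \\ 2 & 4 & 6 \emat \lra{R_1 \leftrightarrow R_2} \bmat{rr|r} 2 & 4 & 6 \\ 0 & 1 & 2 \emat \lra{\frac12 R_1} \bmat{rr|r} 1 & 2 & 3 \\ 0 & 1 & 2 \emat \lra{R_1 - 2R_2} \bmat{rr|r} 1 & 0 & -1 \\ 0 & 1 & 2 \emat $$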

Definition: A matrix is in row echelon form (REF) if it satisfies:

  1. Any rows that are entirely zero are at the bottom.
  2. In each nonzero row, the first nonzero entry (called the leading entry) is further to the right than any leading entries above it.

Definition: A matrix is in reduced row echelon form (RREF) if:

  1. It is in row echelon form.
  2. The leading entry in each nonzero row is a 1 (called a leading 1).
  3. Each column containing a leading 1 is zero everywhere else.
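
For instance (an added illustration), the matrix $$ \bmat{rrr} 1 & 2 & 3 \\ 0 & 1 & 4 \\ 0 & 0 & 0 \emat $$ is in row echelon form but not reduced row echelon form, since column 2 contains a leading 1 but also a nonzero entry above it. Applying $R_1 - 2R_2$ gives $$ \bmat{rrr} 1 & 0 & -5 \\ 0 & 1 & 4 \\ 0 & 0 & 0 \emat , $$ which is in reduced row echelon form.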

We can always use the elementary row operations to get a matrix into REF and RREF:

Row reduction steps: (This technique is crucial for the whole course.)
  1. Find the leftmost column that is not all zeros.
  2. If the top entry is zero, exchange rows to make it nonzero.
  3. It may be convenient to scale this row to make the leading entry into a 1, or to exchange rows to get a 1 here. For RREF, it is almost always best to do this now.
  4. Use the leading entry to create zeros below it, and above it for RREF.
  5. Cover up the row containing the leading entry, and repeat, starting from step 1. (A short worked example follows.)
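
As a quick illustration of these steps (an example added here, not from the text): in the matrix below, the leftmost nonzero column is column 1 (step 1); its top entry is zero, so we exchange $R_1 \leftrightarrow R_2$ (step 2); we use the leading entry to create zeros below it with $R_3 - 2R_1$ (step 4); covering up row 1 and repeating (step 5) leads to $R_3 + R_2$: $$ \bmat{rrr} 0 & 0 & 2 \\ 1 & 2 & 4 \\ 2 & 4 & 6 \emat \lra{R_1 \leftrightarrow R_2} \bmat{rrr} 1 & 2 & 4 \\ 0 & 0 & 2 \\ 2 & 4 & 6 \emat \lra{R_3 - 2R_1} \bmat{rrr} 1 & 2 & 4 \\ 0 & 0 & 2 \\ 0 & 0 & -2 \emat \lra{R_3 + R_2} \bmat{rrr} 1 & 2 & 4 \\ 0 & 0 & 2 \\ 0 & 0 & 0 \emat $$ This is a row echelon form; continuing with $\frac12 R_2$ and then $R_1 - 4R_2$ gives the reduced row echelon form $\bmat{rrr} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \emat$.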

Note: Row echelon form is not unique, but reduced row echelon form is.

Gaussian elimination: This means to do row reduction on the augmented matrix until you get to row echelon form, and then use back substitution to find the solutions.

Gauss-Jordan elimination: This means to do row reduction on the augmented matrix until you get to reduced row echelon form, and then use back substitution to find the solutions.

Back substitution: We call the variables corresponding to a column with a leading entry the leading variables, and the remaining variables the free variables. We solve for the leading variables in terms of the free variables, and assign parameters $s$, $t$, etc. to the free variables.
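
For example (a small added illustration), suppose a system in the variables $x_1, x_2, x_3$ has augmented matrix with row echelon form $$ \bmat{rrr|r} 1 & 2 & 1 & 3 \\ 0 & 0 & 1 & 2 \emat . $$ Then $x_1$ and $x_3$ are the leading variables and $x_2$ is free. Setting $x_2 = t$, the second row gives $x_3 = 2$, and then the first row gives $x_1 = 3 - 2t - 2 = 1 - 2t$, so the solutions are $(x_1, x_2, x_3) = (1 - 2t, t, 2)$ for $t \in \R$.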

Definition: For any matrix $A$, the rank of $A$ is the number of nonzero rows in its row echelon form. It is written $\rank(A)$. (We'll see later that this is the same for all row echelon forms of $A$.)

Note: The number of leading variables equals the rank of the coefficient matrix.

Theorem 2.2: Let $A$ be the coefficient matrix of a linear system with $n$ variables. If the system is consistent, then $$ \text{number of free variables} = n - \rank(A) . $$ When there are 0 free variables, we have a unique solution. When there are 1 or more free variables, we have infinitely many solutions.
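
For instance, in the back substitution example above, the coefficient matrix has rank 2 (two nonzero rows in its row echelon form), so $$ \text{number of free variables} = n - \rank(A) = 3 - 2 = 1 , $$ and the system has infinitely many solutions.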

Consistency: You can tell whether the system is consistent or inconsistent from the row echelon form of the augmented matrix:

  1. If one of the rows is zero except for a nonzero last entry (i.e., a row of the form $[\, 0 \ \cdots \ 0 \mid b \,]$ with $b \neq 0$), then the system is inconsistent.
  2. If this doesn't happen, then the system is consistent, and Theorem 2.2 applies.
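
For example (an added illustration), a row echelon form such as $$ \bmat{rr|r} 1 & 2 & 3 \\ 0 & 0 & 5 \emat $$ shows that the system is inconsistent, since the second row says $0x + 0y = 5$.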

Homogeneous Systems

Definition: A system of linear equations is homogeneous if the constant term in each equation is zero.

Theorem 2.3: A homogeneous system $[ A \mid \vec 0 \, ]$ is always consistent. Moreover, if there are $m$ equations and $n$ variables and $m < n$, then the system has infinitely many solutions.

Note: If $m \geq n$ the system may have infinitely many solutions or it may have only the zero solution.
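
For example (an added illustration of Theorem 2.3 with $m = 1 < n = 2$), the single homogeneous equation $x + 2y = 0$ has infinitely many solutions: $x = -2t$, $y = t$ for any $t \in \R$. On the other hand, with $m = n = 2$, the homogeneous system $x + y = 0$, $x - y = 0$ has only the zero solution.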

Here is an applet for practicing row reduction.

We aren't covering solving systems over $\Z_p$.


New material: Section 2.3: Spanning Sets and Linear Independence

Linear combinations

Recall: A vector $\vv$ is a linear combination of vectors $\vv_1, \vv_2, \ldots, \vv_k$ if there exist scalars $c_1, c_2, \ldots, c_k$ (called coefficients) such that $$ c_1 \vv_1 + \cdots + c_k \vv_k = \vv . $$

Example: Is $\colll 4 8 6$ a linear combination of $\colll 4 5 6$ and $\colll 2 1 3$?

That is, can we find scalars $x$ and $y$ such that $$x \colll 4 5 6 + y \colll 2 1 3 = \colll 4 8 6 ?$$

Expanding this into components, this becomes a linear system $$ \begin{aligned} 4 x + 2 y\ &= 4 \\ 5 x + \ph y\ &= 8 \\ 6 x + 3 y\ &= 6 \end{aligned} $$ and we already know how to determine whether this system is consistent: use row reduction!

The augmented matrix is $$ \kern-8ex \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat \quad \Leftarrow \small\text{Note that the vectors appear as the columns here.} $$ This has row echelon form (work omitted): $$ \bmat{rr|r} 1 & 1/2 & 1 \\ 0 & -3 & 6 \\ 0 & 0 & 0 \emat . $$ From this, we can already see that the system is consistent, so the answer is YES.
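
(Here is one possible sequence of row operations for the omitted work:) $$ \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat \lra{\frac14 R_1} \bmat{rr|r} 1 & 1/2 & 1 \\ 5 & 1 & 8 \\ 6 & 3 & 6 \emat \lra{\mystack{R_2 - 5R_1}{R_3 - 6R_1}} \bmat{rr|r} 1 & 1/2 & 1 \\ 0 & -3/2 & 3 \\ 0 & 0 & 0 \emat \lra{2R_2} \bmat{rr|r} 1 & 1/2 & 1 \\ 0 & -3 & 6 \\ 0 & 0 & 0 \emat $$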

If we want to find $x$ and $y$, we can use back substitution (maybe first going to RREF), and we find that $x = 2$ and $y = -2$ is the unique solution. (Do this at home.)

Example: Is $\colll 4 8 8$ a linear combination of $\colll 4 5 6$ and $\colll 2 1 3$?

Solution: The augmented matrix $$ \bmat{rr|r} 4 & 2 & 4 \\ 5 & 1 & 8 \\ 6 & 3 & \red{8} \emat $$ has row echelon form $$ \bmat{rr|r} 1 & 1/2 & 1 \\ 0 & -3 & 6 \\ 0 & 0 & \red{2} \emat $$ and so the system is inconsistent and the answer is NO.

Theorem 2.4: A system with augmented matrix $[A \mid \vb \,]$ is consistent if and only if $\vb$ is a linear combination of the columns of $A$.
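
For example, in the two examples above: the first system was consistent exactly because $$ \colll 4 8 6 = 2 \colll 4 5 6 - 2 \colll 2 1 3 $$ expresses $\colll 4 8 6$ as a linear combination of the columns of the coefficient matrix, while the second was inconsistent exactly because $\colll 4 8 8$ is not such a linear combination.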

This gives a different geometrical way to understand the solutions to a system. For example, consider the following system from Lecture 7: $$ \begin{aligned} \phm x + y\ &= 2\\ - x + y\ &= 4 \end{aligned} $$ We already know that we can interpret this as finding the point of intersection of two lines in $\R^2$, and so in this case we get a unique solution ($x = -1$, $y = 3$).

But we can also interpret this as writing $\coll 2 4$ as a linear combination of $\coll 1 {-1}$ and $\coll 1 1$, which has a different geometric interpretation.

$ \qquad\begin{aligned} \phm x + y\ &= 2\\ - x + y\ &= 4 \end{aligned} $

$\quad x \coll 1 {-1} + y \coll 1 1 = \coll 2 4$

$\quad \query{x = -1, \quad y = 3}$

Consider also these systems: $$ \begin{aligned} \ph x - \ph y\ &= 2\\ 2 x - 2 y\ &= 4 \end{aligned} \qquad\qquad \begin{aligned} x + 2 y\ &= 2\\ x + 2 y\ &= 3 \end{aligned} $$
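
For reference, here is how these work out: the first system has infinitely many solutions, since both equations describe the same line; equivalently, $\coll 2 4$ can be written as a linear combination of the parallel columns $\coll 1 2$ and $\coll {-1} {-2}$ in infinitely many ways, e.g. $\coll 2 4 = 2 \coll 1 2 + 0 \coll {-1} {-2}$. The second system is inconsistent, since the two lines are parallel; equivalently, $\coll 2 3$ is not a linear combination of $\coll 1 1$ and $\coll 2 2$.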

 

Spanning Sets of Vectors

Definition: If $S = \{ \vv_1, \ldots, \vv_k \}$ is a set of vectors in $\R^n$, then the set of all linear combinations of $\vv_1, \ldots, \vv_k$ is called the span of $\vv_1, \ldots, \vv_k$ and is denoted $\span(\vv_1, \ldots, \vv_k)$ or $\span(S)$.

If $\span(S) = \R^n$, then $S$ is called a spanning set for $\R^n$.

Example: The vectors $\ve_1 = \coll 1 0$ and $\ve_2 = \coll 0 1$ are a spanning set for $\R^2$, since for any vector $\vx = \coll a b$ we have $$ a \coll 1 0 + b \coll 0 1 = \coll a b . $$ Another way to see this is that the augmented matrix associated to $\ve_1$, $\ve_2$ and $\vx$ is $$ \bmat{rr|r} 1 & 0 & a \\ 0 & 1 & b \emat $$ which is already in RREF and is consistent.

Similarly, the standard unit vectors in $\R^n$ are a spanning set for $\R^n$.

Example: Find the span of $\vu = \colll 1 2 3$.

Solution: The span consists of every vector $\vx$ that can be written as $\vx = s \vu$ for some scalar $s$. So it is the line through the origin with direction vector $\vu$.

Example: Find the span of $\vu = \colll 1 2 3$ and $\vv = \colll 4 5 6$.

Solution: The span consists of every vector $\vx$ that can be written as $$ \vx = s \vu + t \vv $$ for some scalars $s$ and $t$. Since $\vu$ and $\vv$ are not parallel, this is the plane through the origin in $\R^3$ with direction vectors $\vu$ and $\vv$.

Example 10-1: What is the span of $\coll 1 3$ and $\coll 2 4$? They are not parallel, so intuitively their linear combinations should fill out all of $\R^2$. On board: We can see this algebraically, by row reducing the augmented matrix $$ \bmat{rr|r} 1 & 2 & a \\ 3 & 4 & b \emat $$
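
One way to carry out this computation (one possible sequence of operations): $$ \bmat{rr|r} 1 & 2 & a \\ 3 & 4 & b \emat \lra{R_2 - 3R_1} \bmat{rr|r} 1 & 2 & a \\ 0 & -2 & b - 3a \emat . $$ This is a row echelon form with no row whose only nonzero entry is in the last column, so the system is consistent for every choice of $a$ and $b$ (in fact there are no free variables, so the solution is unique). Hence every vector $\coll a b$ is a linear combination of $\coll 1 3$ and $\coll 2 4$, and the span is all of $\R^2$.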

Note: The word "span" is really just a fancy way of saying "all linear combinations of these vectors".

Question: What is $\span(\coll 1 2)$? What is $\span(\coll 1 2, \coll 2 4)$? $$\querytext{Both are a line through the origin in the direction $[1, 2]$.}$$

Question: We saw that $\span(\coll 1 0, \coll 0 1) = \R^2$. What is $\span(\coll 1 0, \coll 0 1, \coll 2 4)$? $$\querytext{Also $\R^2$.}$$

Question: What vector is always in $\span(\vv_1, \vv_2, \ldots, \vv_k)$? $$\querytext{The zero vector.}$$

Question: Find some vectors with span equal to $\{ \collll x {2x} {3x} {4x} \mid x \in \R \}$. $$\query{\span(\collll 1 2 3 4) = \{ \collll x {2x} {3x} {4x} \mid x \in \R \}}$$ $$\querytext{Also, $\span(\collll 2 4 6 8) = \{ \collll x {2x} {3x} {4x} \mid x \in \R \}$}$$

Question: Find some vectors with span equal to $\{ \coll x y \mid y = x + 1 \}$.