Math 1600A Lecture 28, Section 2, 15 Nov 2013

$ \newcommand{\bdmat}[1]{\left|\begin{array}{#1}} \newcommand{\edmat}{\end{array}\right|} \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\ccoll}[2]{\bmat{c} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\ccolll}[3]{\bmat{c} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\ccollll}[4]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\colllll}[5]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\ccolllll}[5]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\null}{\textrm{null}} \newcommand{\nullity}{\textrm{nullity}} \renewcommand{\Re}{\operatorname{Re}} \renewcommand{\Im}{\operatorname{Im}} \renewcommand{\Arg}{\operatorname{Arg}} \renewcommand{\arg}{\operatorname{arg}} \newcommand{\adj}{\textrm{adj}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} \newcommand{\qimplies}{\quad\implies\quad} \newcommand{\qtext}[1]{\quad\text{#1}\quad} \newcommand{\qqtext}[1]{\qquad\text{#1}\qquad} \newcommand{\svec}[1]{\,\vec{#1}} \newcommand{\query}[1]{\toggle{\text{?}\vphantom{#1}}{#1}\endtoggle} \newcommand{\smallquery}[1]{\toggle{\text{?}}{#1}\endtoggle} $

Announcements:

Today we finish 4.3. Read Section 4.4 for Monday and also read Appendix D on polynomials (self-study). Work through recommended homework questions.

Tutorials: Quiz 5 is next week.
Office hour: Monday, 1:30-2:30, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.

Midterm 2 Solutions are available from the course home page. The average was 27/40, about 68%.

Question: If $P$ is invertible, how do $\det A$ and $\det(P^{-1}AP)$ compare?
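(One way to see the answer, using the fact that the determinant is multiplicative and that $\det(P^{-1}) = 1/\det P$: $$ \det(P^{-1} A P) = \det(P^{-1}) \det A \, \det P = \frac{1}{\det P} \det A \, \det P = \det A . $$)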

Partial review of last class: Section 4.3

Definition: Let $A$ be an $n \times n$ matrix. A scalar $\lambda$ (lambda) is called an eigenvalue of $A$ if there is a nonzero vector $\vx$ such that $A \vx = \lambda \vx$, i.e. $(A - \lambda I) \vx = \vec 0$. Such a vector $\vx$ is called an eigenvector of $A$ corresponding to $\lambda$.

Definition: The collection of all solutions to $(A - \lambda I) \vx = \vec 0$ is a subspace called the eigenspace of $\lambda$ and is denoted $E_\lambda$. In other words, $$ E_\lambda = \null(A - \lambda I) . $$ It consists of the eigenvectors plus the zero vector.

Definition: If $A$ is $n \times n$, $\det (A - \lambda I)$ will be a degree $n$ polynomial in $\lambda$. It is called the characteristic polynomial of $A$, and $\det (A - \lambda I) = 0$ is called the characteristic equation.

By the fundamental theorem of invertible matrices, the solutions to the characteristic equation are exactly the eigenvalues.
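Spelled out, the chain of reasoning is: $$ \begin{aligned} \lambda \text{ is an eigenvalue of } A &\iff (A - \lambda I) \vx = \vec 0 \text{ has a nontrivial solution} \\ &\iff A - \lambda I \text{ is not invertible} \iff \det(A - \lambda I) = 0 . \end{aligned} $$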

Finding eigenvalues and eigenspaces: Let $A$ be an $n \times n$ matrix.

1. Compute the characteristic polynomial $\det(A - \lambda I)$.
2. Find the eigenvalues of $A$ by solving the characteristic equation $\det(A - \lambda I) = 0$.
3. For each eigenvalue $\lambda$, find a basis for $E_\lambda = \null (A - \lambda I)$ by solving the system $(A - \lambda I) \vx = \vec 0$.
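As a quick illustration of these three steps (a small made-up example, not from the text): let $A = \bmat{rr} 2 & 1 \\ 1 & 2 \emat$. Then $$ \det(A - \lambda I) = \bdmat{cc} 2-\lambda & 1 \\ 1 & 2-\lambda \edmat = (2-\lambda)^2 - 1 = (\lambda - 1)(\lambda - 3) , $$ so the eigenvalues are $\lambda = 1$ and $\lambda = 3$, and solving $(A - I) \vx = \vec 0$ and $(A - 3I) \vx = \vec 0$ shows that $E_1$ has basis $\coll 1 {-1}$ and $E_3$ has basis $\coll 1 1$.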

So we need to get good at solving polynomial equations. Solutions are called zeros or roots.

Theorem D.4 (The Fundamental Theorem of Algebra): A polynomial of degree $n$ has at most $n$ distinct roots.

Therefore:

Theorem: An $n \times n$ matrix $A$ has at most $n$ distinct eigenvalues.

Also:

Theorem D.2 (The Factor Theorem): Let $f$ be a polynomial and let $a$ be a constant. Then $a$ is a zero of $f(x)$ (i.e. $f(a) = 0$) if and only if $x - a$ is a factor of $f(x)$ (i.e. $f(x) = (x - a) g(x)$ for some polynomial $g$).

The largest $k$ such that $(x-a)^k$ is a factor of $f$ is called the multiplicity of the root $a$ in $f$.
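For example, in $f(x) = (x-1)^2 (x+2)$, the root $1$ has multiplicity $2$ and the root $-2$ has multiplicity $1$.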

In the case of an eigenvalue, we call its multiplicity in the characteristic polynomial the algebraic multiplicity of this eigenvalue.

We also define the geometric multiplicity of an eigenvalue $\lambda$ to be the dimension of the corresponding eigenspace $E_\lambda$.

New material: 4.3 continued

Theorem 4.15: The eigenvalues of a triangular matrix are the entries on its main diagonal (repeated according to their algebraic multiplicity).

Example: If $A = \bmat{rrr} 1 & 0 & 0 \\ 2 & 3 & 0 \\ 4 & 5 & 1 \emat$, then $$ \kern-6ex \det(A - \lambda I) = \bdmat{ccc} 1-\lambda & 0 & 0 \\ 2 & 3-\lambda & 0 \\ 4 & 5 & 1-\lambda \edmat = (1 - \lambda)^2 (3 - \lambda) , $$ so the eigenvalues are $\lambda = 1$ (with algebraic multiplicity 2) and $\lambda = 3$ (with algebraic multiplicity 1).
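Continuing this example, we can compute the geometric multiplicity of $\lambda = 1$: $$ A - I = \bmat{rrr} 0 & 0 & 0 \\ 2 & 2 & 0 \\ 4 & 5 & 0 \emat , $$ and solving $(A - I) \vx = \vec 0$ forces $x_1 = x_2 = 0$ with $x_3$ free. So $E_1$ has basis $\colll 0 0 1$, and the geometric multiplicity of $\lambda = 1$ is $1$, even though its algebraic multiplicity is $2$.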

Question: What are the eigenvalues of a diagonal matrix?

Question: What are the eigenvalues of $\bmat{cc} 0 & 4 \\ 1 & 0 \emat$?

Question: How can we tell whether a matrix $A$ is invertible using eigenvalues?

So we can extend the fundamental theorem with two new entries:

Theorem 4.17: Let $A$ be an $n \times n$ matrix. The following are equivalent:
a. $A$ is invertible.
b. $A \vx = \vb$ has a unique solution for every $\vb \in \R^n$.
c. $A \vx = \vec 0$ has only the trivial (zero) solution.
d. The reduced row echelon form of $A$ is $I_n$.
f. $\rank(A) = n$.
g. $\nullity(A) = 0$.
h. The columns of $A$ are linearly independent.
i. The columns of $A$ span $\R^n$.
j. The columns of $A$ are a basis for $\R^n$.
k. The rows of $A$ are linearly independent.
l. The rows of $A$ span $\R^n$.
m. The rows of $A$ are a basis for $\R^n$.
n. $\det A \neq 0$.
o. $0$ is not an eigenvalue of $A$.
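Setting $\lambda = 0$ in the characteristic polynomial shows why (n) and (o) are equivalent: $$ 0 \text{ is an eigenvalue of } A \iff \det(A - 0 I) = \det A = 0 . $$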

Eigenvalues of powers and inverses

Suppose $\vx$ is an eigenvector of $A$ with eigenvalue $\lambda$. What can we say about $A^2$? $A^3$? If $A$ is invertible, how about the eigenvalues/vectors of $A^{-1}$? On whiteboard.
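A sketch of the whiteboard computation: if $A \vx = \lambda \vx$, then $$ A^2 \vx = A(A \vx) = A(\lambda \vx) = \lambda A \vx = \lambda^2 \vx , $$ and repeating this gives $A^k \vx = \lambda^k \vx$ for every positive integer $k$. If $A$ is invertible, then applying $A^{-1}$ to both sides of $A \vx = \lambda \vx$ gives $\vx = \lambda A^{-1} \vx$; since an invertible matrix cannot have $0$ as an eigenvalue, $\lambda \neq 0$, so $A^{-1} \vx = \frac{1}{\lambda} \vx$.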

We've shown:

Theorem 4.18: Let $A$ be a square matrix with eigenvalue $\lambda$ and corresponding eigenvector $\vx$. Then for each positive integer $k$, $\lambda^k$ is an eigenvalue of $A^k$ with corresponding eigenvector $\vx$. If $A$ is invertible, then $1/\lambda$ is an eigenvalue of $A^{-1}$ with corresponding eigenvector $\vx$, and so $\lambda^k$ is an eigenvalue of $A^k$ with eigenvector $\vx$ for every integer $k$.

In contrast to some other recent results, this one is very useful computationally:

Example 4.21: Compute $\bmat{rr} 0 & 1 \\ 2 & 1 \emat^{10} \coll 5 1$.

Solution: By finding the eigenspaces of the matrix, we can show that $$ \bmat{rr} 0 & 1 \\ 2 & 1 \emat \coll 1 {-1} = - \coll 1 {-1} \qtext{and} \bmat{rr} 0 & 1 \\ 2 & 1 \emat \coll 1 2 = 2 \coll 1 2 $$ Write $A = \bmat{rr} 0 & 1 \\ 2 & 1 \emat$, $\vx = \coll 5 1$, $\vv_1 = \coll 1 {-1}$ and $\vv_2 = \coll 1 2$. Since $\vx = 3 \vv_1 + 2 \vv_2$ we have $$ \begin{aligned} A^{10} \vx &= A^{10} (3 \vv_1 + 2 \vv_2) = 3 A^{10} \vv_1 + 2 A^{10} \vv_2 \\ &= 3 (-1)^{10} \vv_1 + 2(2^{10}) \vv_2 = \coll {3+2^{11}}{-3+2^{12}} \end{aligned} $$ Much faster than repeated matrix multiplication, especially if $10$ is replaced with $100$.
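(The coefficients in $\vx = 3 \vv_1 + 2 \vv_2$ are found by solving $c_1 \vv_1 + c_2 \vv_2 = \vx$, i.e. the system $$ c_1 + c_2 = 5 \qtext{and} -c_1 + 2 c_2 = 1 , $$ which gives $c_1 = 3$ and $c_2 = 2$.)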

This raises an interesting question. In the example, the eigenvectors were a basis for $\R^2$, so we could use this method to compute $A^k \vx$ for any $\vx$. However, last class we saw a $3 \times 3$ matrix with two one-dimensional eigenspaces, so the eigenvectors didn't span $\R^3$. We will study this further in Section 4.4, but right now we can answer a related question about linear independence.

Theorem: If $\vv_1, \vv_2, \ldots, \vv_m$ are eigenvectors of $A$ corresponding to distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_m$, then $\vv_1, \vv_2, \ldots, \vv_m$ are linearly independent.

Proof in case $m = 2$: If $\vv_1$ and $\vv_2$ are linearly dependent, then $\vv_1 = c \vv_2$ for some $c$. Therefore $$ A \vv_1 = A \, c \vv_2 = c A \vv_2 $$ so $$ \lambda_1 \vv_1 = c \lambda_2 \vv_2 = \lambda_2 \vv_1 $$ Since $\vv_1 \neq \vec 0$, this forces $\lambda_1 = \lambda_2$, a contradiction.$\quad\Box$

The general case is very similar; see text.

Next: how to become a billionaire using the material from this course.