Math 1600 Lecture 15, Section 002, 11 Oct 2024

$ \newcommand{\bdmat}[1]{\left|\begin{array}{#1}} \newcommand{\edmat}{\end{array}\right|} \newcommand{\bmat}[1]{\left[\begin{array}{#1}} \newcommand{\emat}{\end{array}\right]} \newcommand{\coll}[2]{\bmat{r} #1 \\ #2 \emat} \newcommand{\ccoll}[2]{\bmat{c} #1 \\ #2 \emat} \newcommand{\colll}[3]{\bmat{r} #1 \\ #2 \\ #3 \emat} \newcommand{\ccolll}[3]{\bmat{c} #1 \\ #2 \\ #3 \emat} \newcommand{\collll}[4]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\ccollll}[4]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \emat} \newcommand{\colllll}[5]{\bmat{r} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\ccolllll}[5]{\bmat{c} #1 \\ #2 \\ #3 \\ #4 \\ #5 \emat} \newcommand{\red}[1]{{\color{red}#1}} \newcommand{\blue}[1]{{\color{blue}#1}} \newcommand{\lra}[1]{\mbox{$\xrightarrow{#1}$}} \newcommand{\rank}{\textrm{rank}} \newcommand{\row}{\textrm{row}} \newcommand{\col}{\textrm{col}} \newcommand{\null}{\textrm{null}} \newcommand{\nullity}{\textrm{nullity}} \renewcommand{\Re}{\operatorname{Re}} \renewcommand{\Im}{\operatorname{Im}} \renewcommand{\Arg}{\operatorname{Arg}} \renewcommand{\arg}{\operatorname{arg}} \newcommand{\adj}{\textrm{adj}} \newcommand{\mystack}[2]{\genfrac{}{}{0}{0}{#1}{#2}} \newcommand{\mystackthree}[3]{\mystack{\mystack{#1}{#2}}{#3}} \newcommand{\qimplies}{\quad\implies\quad} \newcommand{\qtext}[1]{\quad\text{#1}\quad} \newcommand{\qqtext}[1]{\qquad\text{#1}\qquad} \newcommand{\smalltext}[1]{{\small\text{#1}}} \newcommand{\svec}[1]{\,\vec{#1}} \newcommand{\querytext}[1]{\toggle{\blue{\text{?}}\vphantom{\text{#1}}}{\text{#1}}\endtoggle} \newcommand{\query}[1]{\toggle{\blue{\text{?}}\vphantom{#1}}{#1}\endtoggle} \newcommand{\smallquery}[1]{\toggle{\blue{\text{?}}}{#1}\endtoggle} \newcommand{\bv}{\mathbf{v}} \newcommand{\cyc}[2]{\cssId{#1}{\style{visibility:hidden}{#2}}} $

Announcements:

Today we finish Sections 3.1 and 3.2. Over reading week, read Section 3.3. Work through suggested exercises.

Homework 5 is on WeBWorK and is due today. The next homework is due in two weeks and will be available sometime during reading week.

Math Help Centre: M-F 12:30-5:30 in PAB48/49 and online 6pm-8pm. Closed during reading week.

My next office hour is today 2:30-3:20 in MC130. No office hours during reading week.

Partial review of Lectures 13 and 14:

Matrix multiplication

Definition: If $A$ is $m \times \red{n}$ and $B$ is $\red{n} \times r$, then the product $C = AB$ is the $m \times r$ matrix whose $i,j$ entry is $$ \kern-6ex \begin{aligned} c_{ij} &= a_{i\red{1}} b_{\red{1}j} + a_{i\red{2}} b_{\red{2}j} + \cdots + a_{i\red{n}} b_{\red{n}j} = \sum_{\red{k}=1}^{n} a_{i\red{k}} b_{\red{k}j} \\ &= \row_i(A) \cdot \col_j(B) . \end{aligned} $$

To remember the shape of $AB$: $$ \mystack{A}{m \times n} \ \ \mystack{B}{n \times r} \mystack{=}{\strut} \mystack{AB}{m \times r} $$
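The entrywise formula can be checked with a short computation. Here is a minimal Python sketch (not part of the notes; the example matrices are my own) that multiplies a $2 \times 3$ matrix by a $3 \times 2$ matrix using exactly the sum $\sum_k a_{ik} b_{kj}$:

```python
# Multiply an m x n matrix A by an n x r matrix B using
# c_ij = sum over k of a_ik * b_kj (row i of A dot column j of B).
def matmul(A, B):
    m, n, r = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(r)]
            for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
B = [[1, 0],
     [0, 1],
     [2, 1]]             # 3 x 2

print(matmul(A, B))      # [[7, 5], [16, 11]] -- a 2 x 2 matrix
```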

For the most part, matrix multiplication behaves like multiplication of real numbers, but there are several differences:

We can have $A \neq O$ but $A^k = O$ for some $k > 1$.
We can have $B \neq \pm I$, but $B^4 = I$.
We can have $AB \neq BA$.
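All three differences can be exhibited with small $2 \times 2$ examples. A quick Python check (the specific matrices are my choices, not from the notes):

```python
# Illustrations of the three differences, using 2x2 examples.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[0, 1],
     [0, 0]]                     # A != O, but A^2 = O
print(matmul(A, A))              # [[0, 0], [0, 0]]

B = [[0, -1],
     [1,  0]]                    # rotation by 90 degrees: B != ±I, B^4 = I
B2 = matmul(B, B)                # B^2 = -I
print(matmul(B2, B2))            # [[1, 0], [0, 1]]

C = [[1, 0],
     [1, 1]]
print(matmul(A, C) == matmul(C, A))   # False: AB != BA in general
```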

But most expected properties do hold:

Theorem 3.3: Let $A$, $B$ and $C$ be matrices of the appropriate sizes, and let $k$ be a scalar. Then:

(a) $A(BC) = (AB)C$ (associativity)
(b) $A(B + C) = AB + AC$ (left distributivity)
(c) $(A+B)C = AC + BC$ (right distributivity)
(d) $k(AB) = (kA)B = A(kB)$ (no cool name)
(e) $I_m A = A = A I_n$ if $A$ is $m \times n$ (identity)

New material: Sections 3.1 and 3.2 continued.

Partitioned Matrices

Sometimes it is natural to view a matrix as partitioned into blocks. For example: $$ \kern-8ex A = \bmat{rrrrr} 1 & 0 & 0 & 2 & 1 \\ 0 & 1 & 0 & 1 & 3 \\ 0 & 0 & 1 & 4 & 0 \\ 0 & 0 & 0 & 1 & 7 \\ 0 & 0 & 0 & 7 & 2 \emat = \bmat{rrr|rr} 1 & 0 & 0 & 2 & 1 \\ 0 & 1 & 0 & 1 & 3 \\ 0 & 0 & 1 & 4 & 0 \\ \hline 0 & 0 & 0 & 1 & 7 \\ 0 & 0 & 0 & 7 & 2 \emat = \bmat{cc} I & D \\ O & C \emat $$ This can make matrix multiplication much easier when there are blocks that are zero or an identity matrix. For example, if $$ B = \bmat{rr} 0 & 0 \\ 0 & 0 \\ 0 & 0 \\ \hline 1 & 0 \\ 0 & 1 \emat = \bmat{c} O \\ I \emat $$

then

$$ \kern-8ex AB = \bmat{cc} I & D \\ O & C \emat \, \bmat{c} O \\ I \emat = \bmat{c} IO + DI \\ OO + CI \emat = \bmat{c} D \\ C \emat = \bmat{rr} 2 & 1 \\ 1 & 3 \\ 4 & 0 \\ \hline 1 & 7 \\ 7 & 2 \emat $$ You pretend that the submatrices are numbers and do matrix multiplication. As long as all of the sizes match up, this works. But keep the left/right order straight!
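We can confirm the block computation numerically: multiplying the full $5 \times 5$ matrix $A$ by the full $5 \times 2$ matrix $B$ gives the same answer as stacking $D$ on top of $C$. A short Python check (plain lists, no libraries):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 0, 0, 2, 1],
     [0, 1, 0, 1, 3],
     [0, 0, 1, 4, 0],
     [0, 0, 0, 1, 7],
     [0, 0, 0, 7, 2]]
B = [[0, 0], [0, 0], [0, 0],
     [1, 0], [0, 1]]

D = [[2, 1], [1, 3], [4, 0]]   # top-right block of A
C = [[1, 7], [7, 2]]           # bottom-right block of A

# The full product agrees with the block answer [D over C]:
print(matmul(A, B) == D + C)   # True
```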

See Example 3.12 for a larger, more complicated worked example.

The most common (and important) cases are when one or both of the matrices are partitioned into rows or columns. For example, if $A$ is $m \times n$ and $B$ is $n \times r$, and we partition $B$ into its columns as $B = [ \, \vb_1 \mid \vb_2 \mid \cdots \mid \vb_r ]$, then we have: $$ \kern-6ex AB = A[ \, \vb_1 \mid \vb_2 \mid \cdots \mid \vb_r ] = [\, A\vb_1 \mid A\vb_2 \mid \cdots \mid A\vb_r ] , $$ where we pretend that $A$ and the columns $\vb_i$ are scalars. Note that $\vb_i$ is $n \times 1$ and $A \vb_i$ is $m \times 1$. The $i$th column of $AB$ consists of the dot products of the rows of $A$ with the $i$th column $\vb_i$ of $B$.
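The column description says that the $j$th column of $AB$ is $A$ applied to the $j$th column of $B$. A small Python illustration (example matrices are my own):

```python
# The j-th column of AB is A times the j-th column of B.
def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

# Extract the columns of B, then apply A to each one.
cols_of_B = [[B[i][j] for i in range(len(B))] for j in range(len(B[0]))]
cols_of_AB = [matvec(A, c) for c in cols_of_B]
print(cols_of_AB)   # [[19, 43], [22, 50]], i.e. AB = [[19, 22], [43, 50]]
```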

Example 15-1: $2 \times 3$ times $3 \times 2$. (Board.)

Note that each column of $AB$ is a linear combination of the columns of $A$.

Similarly, if we partition $A$ into rows, we can compute $$ AB = \bmat{c} A_1 \\ \hline A_2 \\ \hline \vdots \\ \hline A_m \emat B = \bmat{c} A_1 B \\ \hline A_2 B \\ \hline \vdots \\ \hline A_m B \emat $$

Same example on board.

If we partition $A$ into rows and $B$ into columns, we get $$ \kern-8ex AB = \bmat{c} A_1 \\ \hline A_2 \\ \hline \vdots \\ \hline A_m \emat [ \, \vb_1 \mid \vb_2 \mid \cdots \mid \vb_r ] = \bmat{ccc} A_1 \vb_1 & \cdots & A_1 \vb_r \\ \vdots & & \vdots \\ A_m \vb_1 & \cdots & A_m \vb_r \emat $$ which is just the usual description of $AB$, where the $ij$ entry is the dot product of the $i$th row of $A$ with the $j$th column of $B$!

(Outer products and Example 3.11 not covered.)

 

The Transpose and Symmetric Matrices

Here's another operation on matrices, which has no analog for real numbers:

Definition: The transpose of an $m \times n$ matrix $A$ is the $n \times m$ matrix $A^T$ whose $ij$ entry is the $ji$ entry of $A$.

Example 3.14: The transposes of $$ \kern-8ex A = \bmat{rrr} 1 & 3 & 2 \\ 5 & 0 & 1 \emat, \quad B = \bmat{rr} a & b \\ c & d \emat , \quad \text{and} \quad C = \bmat{rrr} 5 & -1 & 2 \emat $$ are $$ \kern-8ex A^T = \bmat{rr} 1 & 5 \\ 3 & 0 \\ 2 & 1 \emat, \quad B^T = \bmat{rr} a & c \\ b & d \emat , \quad \text{and} \quad C^T = \bmat{r} 5 \\ -1 \\ 2 \emat . $$ Note that the columns and rows get interchanged.

One use of the transpose is to convert between row vectors and column vectors. In particular, we can use this to express the dot product in terms of matrix multiplication. If $$ \vu = \bmat{c} u_1 \\ u_2 \\ \vdots \\ u_n \emat \qquad\text{and}\qquad \vv = \bmat{c} v_1 \\ v_2 \\ \vdots \\ v_n \emat $$ then $$ \kern-6ex \vu^T \vv = [ u_1 \, u_2 \, \cdots \, u_n ] \bmat{c} v_1 \\ v_2 \\ \vdots \\ v_n \emat = u_1 v_1 + \cdots + u_n v_n = \vu \cdot \vv $$
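As a sanity check, the $1 \times n$ times $n \times 1$ product collapses to a single number, the dot product. A quick sketch (example vectors are mine):

```python
# u^T v is a 1x1 matrix whose single entry is the dot product u . v.
u = [1, 2, 3]
v = [4, -1, 2]

uTv = sum(ui * vi for ui, vi in zip(u, v))
print(uTv)   # 1*4 + 2*(-1) + 3*2 = 8
```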

Properties of the transpose

Theorem 3.4: Let $A$ and $B$ be matrices of the appropriate sizes, and let $k$ be a scalar. Then:

(a) $(A^T)^T = A$
(b) $(A+B)^T = A^T + B^T$
(c) $(kA)^T = k(A^T)$
(d) $(AB)^T = B^T A^T$ (note the reversed order!)
(e) $(A^r)^T = (A^T)^r$ for all nonnegative integers $r$

(a), (b) and (c) are easy to see. (d) is more of a surprise, so it is worth explaining:

Proof of (d): Suppose $A$ is $m \times n$ and $B$ is $n \times r$. Then both of $(AB)^T$ and $B^T A^T$ are $r \times m$. We have to check that the entries are equal: $$ \kern-8ex \begin{aligned}{} [(AB)^T]_{ij} &= (AB)_{ji} = \row_j(A) \cdot \col_i(B) = \col_j(A^T) \cdot \row_i(B^T) \\ &= \row_i(B^T) \cdot \col_j(A^T) = [(B^T)(A^T)]_{ij} . \qquad\Box %\tag*{∎} \end{aligned} $$

Example 15-2: Board.

Note that (b) and (d) extend to several matrices. For example: $$ \kern-8ex (A + B + C)^T = ((A+B) + C)^T = (A+B)^T + C^T = A^T + B^T + C^T $$ and $$ \kern-6ex (ABC)^T = ((AB)C)^T = C^T (AB)^T = C^T B^T A^T $$ In particular, (e) follows: $(A^r)^T = (A^T)^r$.
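The reversal in (d) and its extension to three factors can be verified numerically. A Python sketch (the example matrices are my own):

```python
# Numerical check of (ABC)^T = C^T B^T A^T (note the reversed order).
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
C = [[2, 0], [1, 1]]

lhs = transpose(matmul(matmul(A, B), C))
rhs = matmul(matmul(transpose(C), transpose(B)), transpose(A))
print(lhs == rhs)   # True
```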

Symmetric matrices

Definition: A square matrix $A$ is symmetric if $A^T = A$. That is, $A_{ij} = A_{ji}$ for every $i$ and $j$.

Example: These matrices are symmetric: $$ \bmat{rr} 1 & 2 \\ 2 & 3 \emat \quad \bmat{rrr} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \emat \quad \bmat{rr} 0 & 0 \\ 0 & 0 \emat $$

Example: These matrices are not symmetric: $$ \bmat{rr} 2 & 1 \\ 3 & 2 \emat \quad \bmat{rrr} 1 & 2 & 1 \\ 5 & 4 & 2 \\ 1 & 5 & 1 \emat \quad \bmat{rrr} 0 & 0 & 0 \\ 0 & 0 & 0 \emat $$

There are two ways to get a symmetric matrix from a non-symmetric matrix:

1. If $A$ is square, then $A + A^T$ is symmetric. This is because $$ \kern-5ex (A + A^T)^T = A^T + (A^T)^T = A^T + A = A + A^T . $$ Example 15-3: Board.

2. If $B$ is any matrix, then $B^T B$ is symmetric. This is because $$ \kern-5ex (B^T B)^T = B^T (B^T)^T = B^T B . $$ The same kind of argument shows that $B B^T$ is symmetric.
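Both constructions are easy to test numerically. A Python sketch (the starting matrices are my own examples):

```python
# Two ways to build a symmetric matrix from a non-symmetric one.
def transpose(A):
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2],
     [5, 4]]                       # square, not symmetric
S = add(A, transpose(A))
print(S == transpose(S))           # True: A + A^T is symmetric

B = [[1, 2, 3],
     [4, 5, 6]]                    # B need not be square for B^T B
P = matmul(transpose(B), B)
print(P == transpose(P))           # True: B^T B is symmetric
```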

Example 15-4: Board.

True/false: If $A$ is symmetric, so is $A^2$.

True/false: If $A$ and $B$ are symmetric matrices of the same size, then $AB$ is symmetric.

Challenge problems from before

Question: Find a $3 \times 3$ matrix $A$ such that $A^2 \neq O$ but $A^3 = O$.

Question: Find a $2 \times 2$ matrix $A$ such that $A \neq I_2$ but $A^3 = I_2$.

Similarly, for each $n$ you can find a matrix $A$ such that $A^n = I$ but no lower power of $A$ is the identity.

Don't forget that I have an office hour later today in MC130.