Continue reading Section 3.3, but note that we aren't covering elementary matrices. Work through the recommended homework questions.
Tutorials: Quiz 3 this week covers material up to the end of Section 3.2.
Solutions to the midterm are available from the course home page. Class average was 31/40 = 77.5%. Great work! But keep in mind that the material naturally gets much more difficult.
Office hour: today, 12:30-1:30, MC103B.
Help Centers: Monday-Friday 2:30-6:30 in MC 106.
For example, if $B$ is partitioned into columns as $B = [ \, \vb_1 \mid \vb_2 \mid \cdots \mid \vb_r ]$, then we have: $$ AB = [\, A\vb_1 \mid A\vb_2 \mid \cdots \mid A\vb_r ] . $$
Also, remember that if $A$ is partitioned into columns as $A = [ \, \va_1 \mid \va_2 \mid \cdots \mid \va_n ]$, then $$ A \vx = x_1 \va_1 + \cdots + x_n \va_n , $$ a linear combination of the columns of $A$.
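As a quick numerical illustration (the numbers here are chosen just to make the formula concrete): if $A = \bmat{rr} 1 & 2 \\ 3 & 7 \emat$ and $\vx = \bmat{r} 2 \\ -1 \emat$, then $$ A \vx = 2 \bmat{r} 1 \\ 3 \emat + (-1) \bmat{r} 2 \\ 7 \emat = \bmat{r} 0 \\ -1 \emat , $$ which agrees with computing $A \vx$ entry by entry.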
After adding, subtracting and multiplying, what is missing?
What's missing is division. For real numbers, we solve $ax = b$ (with $a \neq 0$) by multiplying both sides by $a^{-1}$. We could do the same thing for a matrix equation $A \vx = \vb$ if we could find a matrix $A'$ such that $A' A = I$. Then:
$$
A \vx = \vb
\qimplies
A' A \vx = A' \vb
\qimplies
\vx = A' \vb .
$$
So, if $A \vx = \vb$ has a solution, then it must be $A' \vb$.
On the other hand, let's check whether $A' \vb$ is a solution:
$$
A ( A' \vb ) = A A' \vb = \text{??} = \vb ,
$$
where the last step only works if we know that $A A' = I$ as well.
So we require both conditions:
Definition: An inverse of an $n \times n$ matrix $A$ is an $n \times n$ matrix $A'$ such that $$ A A' = I \qtext{and} A' A = I . $$ If such an $A'$ exists, we say that $A$ is invertible.
(One could consider this for $A$ not square, but no such $A'$ would ever exist.)
Example: If $A = \bmat{rr} 1 & 2 \\ 3 & 7 \emat$, then $A' = \bmat{rr} 7 & -2 \\ -3 & 1 \emat$ is an inverse of $A$. (On whiteboard.)
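In outline, the check goes like this: $$ A A' = \bmat{rr} 1 & 2 \\ 3 & 7 \emat \bmat{rr} 7 & -2 \\ -3 & 1 \emat = \bmat{rr} 7 - 6 & -2 + 2 \\ 21 - 21 & -6 + 7 \emat = \bmat{rr} 1 & 0 \\ 0 & 1 \emat , $$ and a similar computation gives $A' A = I$.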
Example: Does $O = \bmat{rr} 0 & 0 \\ 0 & 0 \emat$ have an inverse?
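(Answer: no. For any $2 \times 2$ matrix $A'$, we have $O A' = O \neq I$, so no inverse can exist.)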
Example: Does $B = \bmat{rr} -1 & 3 \\ 2 & -6 \emat$ have an inverse?
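(Answer: no. The second column of $B$ is $-3$ times the first, so the columns of $B$ span only a line; then $B \vx = \vb$ has no solution for most $\vb$, and $B$ cannot be invertible. We confirm this below using the determinant.)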
We'll learn next class how to determine whether a matrix has an inverse, and how to find it when it does. Today we'll discuss some general properties, and also $2 \times 2$ matrices.
Theorem 3.6: If $A$ is an invertible matrix, then its inverse is unique.
Proof: Suppose that $A'$ and $A''$ are both inverses of $A$. We'll show they must be equal: $$ \kern-4ex A' = A' I = A' (A A'') = (A' A) A'' = I A'' = A'' . \qquad\Box $$ Because of this, we write $A^{-1}$ for the inverse of $A$, when $A$ is invertible. We do not write $\frac{1}{A}$.
Theorem 3.7: If $A$ is an invertible $n \times n$ matrix, then the system $A \vx = \vb$ has the unique solution $\vx = A^{-1} \vb$ for any $\vb$ in $\R^n$.
This follows from the argument we gave earlier.
Example on whiteboard: Solve the systems $$ \kern-4ex \begin{aligned} \ph x + 2 y &= 3 \\ 3 x + 7 y &= 4 \end{aligned} \qqtext{and} \begin{aligned} \ph x + 2 y &= \ph \, 2 \\ 3 x + 7 y &= -1 \end{aligned} $$
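Sketch of the solution: both systems have coefficient matrix $A = \bmat{rr} 1 & 2 \\ 3 & 7 \emat$, whose inverse we found above, so by Theorem 3.7 $$ \vx = A^{-1} \bmat{r} 3 \\ 4 \emat = \bmat{rr} 7 & -2 \\ -3 & 1 \emat \bmat{r} 3 \\ 4 \emat = \bmat{r} 13 \\ -5 \emat \qqtext{and} \vx = A^{-1} \bmat{r} 2 \\ -1 \emat = \bmat{rr} 7 & -2 \\ -3 & 1 \emat \bmat{r} 2 \\ -1 \emat = \bmat{r} 16 \\ -7 \emat . $$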
Remark: This is not in general an efficient way to solve a system. Using row reduction is usually faster. And row reduction works when the coefficient matrix is not square or not invertible. The above method can be useful if you need to solve a lot of systems with the same $A$ but varying $\vb$.
Theorem: The matrix $A = \bmat{cc} a & b \\ c & d \emat$ is invertible if and only if $ad - bc \neq 0$. When this is the case, $$ A^{-1} = \frac{1}{ad-bc} \, \bmat{rr} \red{d} & \red{-}b \\ \red{-}c & \red{a} \emat . $$
We call $ad-bc$ the determinant of $A$, and write it $\det A$.
It determines whether or not $A$ is invertible, and also shows
up in the formula for $A^{-1}$.
Example: The determinant of $A = \bmat{rr} 1 & 2 \\ 3 & 7 \emat$ is $\det A = 1 \cdot 7 - 2 \cdot 3 = 1$, so $$A^{-1} = \frac{1}{1} \bmat{rr} 7 & -2 \\ -3 & 1 \emat ,$$ as we saw before.
Example: The determinant of $B = \bmat{rr} -1 & 3 \\ 2 & -6 \emat$ is $\det B = (-1)(-6) - 3 \cdot 2 = 0$, so $B$ is not invertible (as we saw).
Why the formula works: Show on whiteboard that $$ \bmat{cc} a & b \\ c & d \emat \bmat{rr} d & -b \\ -c & a \emat = \det A \bmat{cc} 1 & 0 \\ 0 & 1 \emat . $$ Therefore, $$ \kern-8ex \bmat{cc} a & b \\ c & d \emat \left( \frac{1}{\det A} \bmat{rr} d & -b \\ -c & a \emat \right) = \frac{\det A}{\det A} \bmat{cc} 1 & 0 \\ 0 & 1 \emat = \bmat{cc} 1 & 0 \\ 0 & 1 \emat . $$ A similar argument works for the other order of multiplication.
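In outline: $$ \bmat{cc} a & b \\ c & d \emat \bmat{rr} d & -b \\ -c & a \emat = \bmat{cc} ad - bc & -ab + ba \\ cd - dc & -cb + da \emat = \bmat{cc} ad - bc & 0 \\ 0 & ad - bc \emat = \det A \bmat{cc} 1 & 0 \\ 0 & 1 \emat . $$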
Why $A$ is not invertible when $ad-bc = 0$: We have $ad=bc$. If $b$ and $d$ are non-zero, then $a/b = c/d$. This means that the columns of $A$ are parallel, so we can't solve every system $A \vx = \vb$. (A solution only exists if $\vb$ is in the span of the columns of $A$, which is a line.) So $A$ is not invertible.
Similarly, if $b$ or $d$ is zero, the columns of $A$ are still parallel. For example, if $b = 0$, then $ad = bc = 0$, so $a = 0$ or $d = 0$: in the first case both columns lie along the second coordinate axis, and in the second case the second column is the zero vector. The case $d = 0$ is handled in the same way, so again $A$ is not invertible.
Theorem 3.9: Assume $A$ and $B$ are invertible matrices of the same size. Then:
a. $A^{-1}$ is invertible and
$$(A^{-1})^{-1} = \query{A} $$
b. If $c$ is a non-zero scalar, then $cA$ is invertible and
$$(cA)^{-1} = \query{\frac{1}{c} A^{-1}} $$
c. $AB$ is invertible and
$$(AB)^{-1} = \query{B^{-1} A^{-1} \quad\text{(socks and shoes rule)}} $$
d. $A^T$ is invertible and
$$ (A^T)^{-1} = \query{(A^{-1})^T} $$
e. $A^n$ is invertible for all nonnegative integers $n$ and
$$ (A^n)^{-1} = \query{(A^{-1})^n} $$
To verify these, in every case you just check that the matrix shown is an inverse. All five are done on the whiteboard.
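For instance, the check for (c) goes like this: $$ (AB)(B^{-1} A^{-1}) = A (B B^{-1}) A^{-1} = A I A^{-1} = A A^{-1} = I , $$ and similarly $(B^{-1} A^{-1})(AB) = I$, so $B^{-1} A^{-1}$ is an inverse of $AB$ (and hence, by Theorem 3.6, the inverse).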
Remark: Property (c) is the most important, and generalizes to more than two matrices, e.g. $(ABC)^{-1} = C^{-1} B^{-1} A^{-1}$.
Remark: For $n$ a positive integer, we define $A^{-n}$ to be $(A^{-1})^n = (A^n)^{-1}$. Then $A^n A^{-n} = I = A^0$, and more generally $A^r A^s = A^{r+s}$ for all integers $r$ and $s$.
Remark: There is no formula for $(A+B)^{-1}$. In fact, $A+B$ might not be invertible, even if $A$ and $B$ are.
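For example, $A = I$ and $B = -I$ are both invertible, but $A + B = O$ is not.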
We can use these properties to solve a matrix equation for an unknown matrix. Assume that $A$, $B$ and $X$ are invertible matrices of the same size.
Example: Solve $AXB^2 = B A B^{-1}$ for $X$.
Solution: $$ \begin{aligned} AXB^2 = B A B^{-1} &\qimplies A^{-1} (A X B^2) B^{-2} = A^{-1} (B A B^{-1}) B^{-2} \\ &\qimplies X = A^{-1} B A B^{-3} \end{aligned} $$
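(Check, substituting back in: $A (A^{-1} B A B^{-3}) B^2 = (A A^{-1}) B A (B^{-3} B^2) = B A B^{-1}$, as required.)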
Example: Solve $(AXB)^{-1} = BA$ for $X$.
Solution: $$ \begin{aligned} (AXB)^{-1} = BA &\qimplies ((A X B)^{-1})^{-1} = (BA)^{-1} \\ &\qimplies A X B = A^{-1} B^{-1} \\ &\qimplies A^{-1} (A X B) B^{-1} = A^{-1} (A^{-1} B^{-1}) B^{-1} \\ &\qimplies X = A^{-2} B^{-2} \end{aligned} $$
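(Check: with $X = A^{-2} B^{-2}$, we get $A X B = A A^{-2} B^{-2} B = A^{-1} B^{-1}$, and so $(A X B)^{-1} = (A^{-1} B^{-1})^{-1} = B A$ by the socks and shoes rule, as required.)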
Can you find a $2 \times 3$ matrix $A$ and a $3 \times 2$ matrix $A'$ such that $A A' = I_2$ and $A' A = I_3$?