Suppose $T \in \mathcal{L}(V)$ and there is a basis of $V$, denoted $\beta = v_1, \dots, v_n$, for which $M(T, \beta)$ is U.T. Then $T$ is invertible iff every diagonal entry in $M(T, \beta)$ is non-zero.
Proof
Consider $(\Rightarrow)$: suppose $T$ is invertible. We have:
$$
M(T, \beta) = \begin{bmatrix} \lambda_1 & & * \\ & \ddots & \\ 0 & & \lambda_n \end{bmatrix}
$$
Assume the $k$-th diagonal entry of this matrix is 0:
$$
M(T, \beta) = \begin{bmatrix} \lambda_1 & & & & * \\ & \ddots & & & \\ & & 0 & & \\ & & & \ddots & \\ 0 & & & & \lambda_n \end{bmatrix} \qquad (\lambda_k = 0)
$$
Since the matrix is U.T. and $\lambda_k = 0$, the first $k$ columns have only zeros in rows $k$ through $n$ (the block below them is still the zero matrix), so each of $Tv_1, \dots, Tv_k$ lies in $\operatorname{span}(v_1, \dots, v_{k-1})$. Notice that $Tv_1, \dots, Tv_k$ is LI (because $T$ is invertible and $v_1, \dots, v_k$ is LI) while they are in the $(k-1)$-dimensional space $\operatorname{span}(v_1, \dots, v_{k-1})$. Which is a contradiction. See the proof at [[Lecture 24 - Finishing Eigenstuff#^d13947]] for a more specific example.
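To see the contradiction in a small concrete case (a made-up $3 \times 3$ instance with $k = 2$, not the lecture's example): if
$$
M(T, \beta) = \begin{bmatrix} \lambda_1 & a & b \\ 0 & 0 & c \\ 0 & 0 & \lambda_3 \end{bmatrix},
$$
then $Tv_1 = \lambda_1 v_1$ and $Tv_2 = a v_1$ both lie in the one-dimensional space $\operatorname{span}(v_1)$, so $Tv_1, Tv_2$ cannot be LI, even though invertibility of $T$ and linear independence of $v_1, v_2$ would force them to be.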
Now consider $(\Leftarrow)$. Suppose the diagonal entries of $M(T, \beta)$ are all non-zero:
$$
M(T, \beta) = \begin{bmatrix} \lambda_1 & & * \\ & \ddots & \\ 0 & & \lambda_n \end{bmatrix}
$$
where $\lambda_j$ is non-zero for all $j$. We need to show $T$ is invertible. Since $T$ is from $V$ to $V$ (and $V$ is finite-dimensional), we only need to prove either injectivity or surjectivity.
We will show $T$ is surjective. The first column says that:
$$
Tv_1 = \lambda_1 v_1
$$
Since $\lambda_1 \neq 0$, then:
$$
T\left(\frac{1}{\lambda_1} v_1\right) = v_1
$$
So then $v_1 \in \operatorname{range}(T)$.
Repeat for the second column. Here, writing $a_{1,2}$ for the $(1,2)$-entry of $M(T, \beta)$:
$$
Tv_2 = a_{1,2} v_1 + \lambda_2 v_2 \quad\Longrightarrow\quad v_2 = T\left(\frac{1}{\lambda_2} v_2\right) - \frac{a_{1,2}}{\lambda_2} v_1
$$
Thus $v_2 \in \operatorname{range}(T)$, since the vector on the right is a linear combination of $T\left(\frac{1}{\lambda_2} v_2\right)$ and $v_1$, both of which are already in the range of $T$.
Repeat this process, up to $v_n$. Hence, all $v_j \in \operatorname{range}(T)$, so then $\operatorname{range}(T) = V$, showing $T$ is surjective.
☐
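As a quick sanity check of the theorem (the matrices below are made-up examples, not from the lecture), compare the two U.T. matrices
$$
A = \begin{bmatrix} 2 & 1 & 5 \\ 0 & 3 & 4 \\ 0 & 0 & 1 \end{bmatrix}, \qquad B = \begin{bmatrix} 2 & 1 & 5 \\ 0 & 0 & 4 \\ 0 & 0 & 1 \end{bmatrix}.
$$
Every diagonal entry of $A$ is non-zero, so an operator represented by $A$ is invertible. $B$ has a $0$ in the $(2,2)$ slot, and indeed $Be_1 = 2e_1$ and $Be_2 = e_1$ both lie in $\operatorname{span}(e_1)$, so $B$ is not injective and hence not invertible.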
We're now going to use this theorem to prove a bigger theorem:
Theorem
Suppose that $T \in \mathcal{L}(V)$ has an U.T. matrix representation with respect to the basis $\beta$, denoted $M(T, \beta)$. Then the eigenvalues of $T$ are precisely the diagonal entries of $M(T, \beta)$.
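For a concrete illustration (again a made-up matrix rather than the lecture's example), if
$$
M(T, \beta) = \begin{bmatrix} 4 & 1 & 0 \\ 0 & -2 & 7 \\ 0 & 0 & 4 \end{bmatrix},
$$
then the theorem says the eigenvalues of $T$ are exactly $4$ and $-2$, the distinct diagonal entries; no other scalar is an eigenvalue.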
So since $M(D, \beta')$ has no zeros on the diagonal, $D$ is invertible. Furthermore, $\lambda = 1$ and $\lambda = -1$ are the eigenvalues of $D$.

# Diagonal Matrix Representations

Recall that:

> [!definition] Diagonal Matrix
> $A_{n \times n}$ is called *diagonal* if all entries off the diagonal are 0's.

For instance, $\text{diag}(1,2,3) = \begin{bmatrix}1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3\end{bmatrix}$. The zero matrix is a diagonal matrix.

Note that diagonal matrices are always U.T., so all the properties we proved for U.T. matrices apply, such as reading the eigenvalues off the diagonal. But we want to ask: **when** does a diagonal matrix representation $M(T)$ exist for $T \in \mathcal{L}(V)$? Suppose: