(4): Suppose $\lambda$ is a zero of the minimal polynomial $p$ of $T$. Thus $(z - \lambda)$ is a factor of $p$, so then:
$$p(z) = (z - \lambda) q(z)$$
Plugging in $T$:
$$p(T) = (T - \lambda I) q(T)$$
Note $\deg q < \deg p$, so $q(T) \neq 0$ since $p$ is the minimal polynomial. But then:
$$0 = p(T) = (T - \lambda I) q(T)$$
Thus, since $q(T) \neq 0$, there's some vector $v$ such that $q(T)v \neq 0$. As such, then:
$$(T - \lambda I)\big(q(T)v\big) = p(T)v = 0$$
So $\lambda$ is an eigenvalue of $T$ via the eigenvector $q(T)v$. This shows that every zero of $p$ is an eigenvalue of $T$. For the converse, suppose $\lambda$ is an eigenvalue. Then $Tv = \lambda v$ for some $v \neq 0$. Then:
$$p(T) = 0$$
Plug in $v$:
$$0 = p(T)v = p(\lambda)v$$
Thus $p(\lambda) = 0$, since $v \neq 0$.
☐
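To see this concretely, here's a small sympy sketch. The matrix $T$ and its minimal polynomial $p$ below are arbitrary illustrative choices (not from the notes); the sketch just checks that $p(T) = 0$, that every zero of $p$ makes $T - \lambda I$ singular, and that every eigenvalue of $T$ is a zero of $p$.

```python
import sympy as sp

z = sp.symbols("z")

# Example operator with eigenvalues 2 (inside a 2x2 Jordan block) and 5.
T = sp.Matrix([
    [2, 1, 0],
    [0, 2, 0],
    [0, 0, 5],
])
I3 = sp.eye(3)

# Known minimal polynomial for this particular T: p(z) = (z - 2)^2 (z - 5).
p = (z - 2)**2 * (z - 5)

# p(T) is the zero matrix, as the minimal polynomial must satisfy.
assert (T - 2*I3)**2 * (T - 5*I3) == sp.zeros(3, 3)

# Every zero of p is an eigenvalue: T - lambda*I is singular there.
for lam in sp.roots(p, z):
    assert (T - lam*I3).det() == 0

# Conversely, every eigenvalue of T is a zero of p.
for lam in T.eigenvals():
    assert p.subs(z, lam) == 0

print("zeros of p == eigenvalues of T")
```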
This implies that if you have the characteristic polynomial of $T$, then you kinda know what the minimal polynomial should look like. Suppose the characteristic polynomial is:
$$q(z) = (z - \lambda_1)^{d_1} \cdots (z - \lambda_m)^{d_m}$$
Then the minimal polynomial $p$ is:
$$p(z) = (z - \lambda_1)^{k_1} \cdots (z - \lambda_m)^{k_m}$$
where $1 \leq k_j \leq d_j$ for all $j$.
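As a sanity check of this shape, here is a brute-force sketch. The function name `minimal_polynomial_by_search` and the example matrix are made up for illustration; the idea is simply to try every exponent choice $1 \leq k_j \leq d_j$, lowest total degree first, and keep the first candidate that annihilates $T$.

```python
import itertools
import sympy as sp

def minimal_polynomial_by_search(T):
    """Try exponents 1 <= k_j <= d_j, lowest degree first, and return the
    first candidate (z - l_1)^{k_1}...(z - l_m)^{k_m} with candidate(T) = 0.
    Assumes sympy can find T's eigenvalues exactly."""
    z = sp.symbols("z")
    n = T.shape[0]
    char_roots = T.eigenvals()                     # {lambda_j: d_j}
    lambdas = list(char_roots.keys())
    d = [char_roots[lam] for lam in lambdas]

    exponent_choices = itertools.product(*[range(1, dj + 1) for dj in d])
    for ks in sorted(exponent_choices, key=sum):   # lowest total degree first
        matrix_value = sp.eye(n)
        poly = sp.Integer(1)
        for lam, k in zip(lambdas, ks):
            matrix_value = matrix_value * (T - lam * sp.eye(n))**k
            poly = poly * (z - lam)**k
        if matrix_value == sp.zeros(n, n):
            return sp.expand(poly)

# Example: characteristic polynomial (z - 3)^2, minimal polynomial just z - 3.
T = sp.Matrix([[3, 0], [0, 3]])
print(minimal_polynomial_by_search(T))   # z - 3
```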
Example
Suppose $T \in \mathcal{L}(V)$. Choose suitable vectors and compose a basis of $V$ out of them; with respect to that basis we get $\mathcal{M}(T)$.
This is an upper-triangular matrix, so we can read the eigenvalues and their multiplicities off the diagonal. We could further find a basis giving block-diagonal matrices. However, here we can already write down the characteristic polynomial, and by the above the minimal polynomial must have the same form with exponents $1 \leq k_j \leq d_j$. But which ones? We calculate for the smaller candidate:
We could find its matrix representation and see that it is not the zero matrix, so that candidate is not the minimal polynomial. Thus, the characteristic polynomial in this case is the minimal polynomial.
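The matrix from this example isn't reproduced in these notes, so the sketch below uses a stand-in upper-triangular matrix to illustrate the same check: read the eigenvalues off the diagonal, test the candidate with smaller exponents, and if it does not annihilate $T$, conclude that the characteristic polynomial is the minimal polynomial.

```python
import sympy as sp

# Stand-in upper-triangular matrix: eigenvalues 6 (twice) and 7.
T = sp.Matrix([
    [6, 1, 0],
    [0, 6, 0],
    [0, 0, 7],
])
I3 = sp.eye(3)

# Characteristic polynomial: (z - 6)^2 (z - 7).
# Candidate with a smaller exponent: (z - 6)(z - 7).
smaller = (T - 6*I3) * (T - 7*I3)
full    = (T - 6*I3)**2 * (T - 7*I3)

print(smaller == sp.zeros(3, 3))   # False: (z-6)(z-7) is not the minimal polynomial
print(full == sp.zeros(3, 3))      # True: here char poly == min poly
```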
8.D: Jordan Decomposition
While Axler's proof doesn't use quotient spaces, we'll use them here since we are good at them.
Recall that for any operator $T \in \mathcal{L}(V)$ (with $V$ a complex vector space) there are distinct eigenvalues $\lambda_1, \dots, \lambda_m$ such that:
$$V = G(\lambda_1, T) \oplus \cdots \oplus G(\lambda_m, T)$$
where, if $N_j = (T - \lambda_j I)|_{G(\lambda_j, T)}$, then each $N_j$ is nilpotent:
$$N_j^{\dim G(\lambda_j, T)} = 0$$
We know that we can find a basis such that $\mathcal{M}(T)$ is "extra upper triangular": block diagonal, with each block an upper-triangular sub-matrix whose diagonal entries are all equal to $\lambda_j$.
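Here is a small illustration of that "extra upper triangular" shape, with a made-up block-diagonal matrix (not one from the notes): each block is upper triangular with a single eigenvalue on its diagonal, and subtracting that eigenvalue from the block gives a nilpotent operator.

```python
import sympy as sp

lam1, lam2 = 2, 5
block1 = sp.Matrix([[lam1, 1],
                    [0, lam1]])                 # 2x2 block for lambda_1
block2 = sp.Matrix([[lam2, 1, 0],
                    [0, lam2, 1],
                    [0, 0, lam2]])              # 3x3 block for lambda_2
M = sp.diag(block1, block2)                     # block-diagonal "M(T)"

# On each block, (block - lambda_j I) is nilpotent: its dim-th power vanishes.
N1 = block1 - lam1 * sp.eye(2)
N2 = block2 - lam2 * sp.eye(3)
print(N1**2 == sp.zeros(2, 2))   # True
print(N2**3 == sp.zeros(3, 3))   # True
```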
The question here is: can we do better? Namely, can we get more zeroes in $\mathcal{M}(T)$?
Goal
We want to produce a basis such that $\mathcal{M}(T)$ has many zeros.
An Example
Let's do an example to get our feet wet. Suppose $N \in \mathcal{L}(V)$ with $N$ nilpotent. Further, we suppose that we know the dimensions along the chain of null spaces:
$$\operatorname{null} N \subseteq \operatorname{null} N^2 \subseteq \operatorname{null} N^3 \subseteq \cdots$$
So, since these powers of $N$ have non-trivial null spaces, we can choose vectors from each of them:
Choose vectors forming a basis of $\operatorname{null} N$. Note that every later null space in the chain contains this one.
Now consider the vectors added at the next step. They are vectors which lie in $\operatorname{null} N$ plus something new. But we know the dimension of $\operatorname{null} N^2$, so once we have that many independent vectors the list is a valid basis for it.
Now consider the next null space in the chain; extending the list in the same way gives a basis for it.
At the following step, the list we have is a start, but we need to add 2 more vectors, so we add them to the list.
At the last step, the space under consideration has dimension 4, so we add vectors on until the list has 4 members.
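The specific dimensions of this example aren't reproduced above, so the sketch below just computes the chain $\dim \operatorname{null} N^k$ for a stand-in nilpotent matrix; this chain of dimensions is the data the construction walks along.

```python
import sympy as sp

# A 5x5 nilpotent matrix: one 3-chain and one 2-chain (an arbitrary choice).
N = sp.diag(
    sp.Matrix([[0, 1, 0], [0, 0, 1], [0, 0, 0]]),
    sp.Matrix([[0, 1], [0, 0]]),
)

n = N.shape[0]
dims = []
for k in range(1, n + 1):
    dims.append(n - (N**k).rank())   # dim null N^k = n - rank(N^k)

print(dims)   # [2, 4, 5, 5, 5]: the chain stabilizes once N^k = 0
```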
In general:
Choose vectors forming a basis of $\operatorname{null} N$.
Given a basis of $\operatorname{null} N^k$, extend it to a basis of $\operatorname{null} N^{k+1}$.
Repeat for growing values of $k$, each time adding as many new vectors as the difference in dimensions along our null space chain (a computational sketch of this procedure is given below).
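As a rough computational sketch of this procedure: the helper `chain_adapted_basis` below is made up for illustration, and it only produces a basis adapted to the null space chain (it greedily extends the list at each step), not yet a full Jordan basis.

```python
import sympy as sp

def chain_adapted_basis(N):
    """Return a basis of V ordered so that the first dim null N^k vectors
    span null N^k, for every k (assumes N is nilpotent)."""
    n = N.shape[0]
    basis = []                                   # vectors chosen so far
    k = 1
    while len(basis) < n:
        for v in (N**k).nullspace():             # candidate vectors in null N^k
            extended = sp.Matrix.hstack(*(basis + [v]))
            if extended.rank() == len(basis) + 1:
                basis.append(v)                  # v is independent of the list so far
        k += 1
    return basis

# Example with the same stand-in nilpotent matrix as in the previous sketch.
N = sp.diag(
    sp.Matrix([[0, 1, 0], [0, 0, 1], [0, 0, 0]]),
    sp.Matrix([[0, 1], [0, 0]]),
)
for v in chain_adapted_basis(N):
    print(v.T)
```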