HW 8 - Operator Decomposition, Characteristic and Minimal Polynomials
8.B: Operator Decomposition
6
Question
Define $N \in \mathcal{L}(\mathbf{F}^5)$ by:
$$N(x_1, x_2, x_3, x_4, x_5) = (2x_2, 3x_3, -x_4, 4x_5, 0).$$
Find a square root of $I + N$.
Proof
As a suggestion, use the Taylor series for $\sqrt{1+x}$ as motivation (because $N$ here seems to be nilpotent):
$$\sqrt{1+x} = 1 + \frac{x}{2} - \frac{x^2}{8} + \frac{x^3}{16} - \frac{5x^4}{128} + \cdots$$
Substitute $x = N$, where we assume $N^5 = 0$ (and will verify), so the series terminates:
$$R = I + \frac{N}{2} - \frac{N^2}{8} + \frac{N^3}{16} - \frac{5N^4}{128}.$$
First, let's show that $N$ is nilpotent by showing that $N^5 = 0$. Notice that, using $e_1, \dots, e_5$ as the standard basis:
$$\mathcal{M}(N) = \begin{pmatrix} 0 & 2 & 0 & 0 & 0 \\ 0 & 0 & 3 & 0 & 0 \\ 0 & 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 0 & 4 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix}.$$
Thus $\mathcal{M}(N)$ has the form of a nilpotent operator (strictly upper triangular), so $N$ is nilpotent, and in particular $N^5 = 0$. Notice that we can verify our candidate for $\sqrt{I+N}$ works by just squaring it and collecting powers of $N$ (everything from $N^5$ onward vanishes):
$$R^2 = I + N + \left(\frac{1}{4} - \frac{1}{4}\right)N^2 + \left(\frac{1}{8} - \frac{1}{8}\right)N^3 + \left(\frac{1}{16} - \frac{5}{64} + \frac{1}{64}\right)N^4 = I + N,$$
showing $R$ is a valid square root of $I + N$.
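As a quick numerical sanity check of the squaring computation (this check is my own addition, not part of the proof; `M` is the matrix of $N$ written out above):

```python
import numpy as np

# Matrix of N in the standard basis of F^5.
M = np.array([
    [0, 2, 0, 0, 0],
    [0, 0, 3, 0, 0],
    [0, 0, 0, -1, 0],
    [0, 0, 0, 0, 4],
    [0, 0, 0, 0, 0],
], dtype=float)
I = np.eye(5)

# Truncated Taylor series for sqrt(I + N); since N^5 = 0 the series is exact.
R = (I + M / 2
       - np.linalg.matrix_power(M, 2) / 8
       + np.linalg.matrix_power(M, 3) / 16
       - 5 * np.linalg.matrix_power(M, 4) / 128)

assert np.allclose(np.linalg.matrix_power(M, 5), 0)  # N is nilpotent
assert np.allclose(R @ R, I + M)                     # R^2 = I + N
```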
☐
7
Question
Suppose $V$ is a complex vector space. Prove that every invertible operator on $V$ has a cube root.
Proof
Let $T \in \mathcal{L}(V)$ be an invertible operator, so $0$ is not an eigenvalue of $T$. We'll actually prove that every invertible operator on $V$ has a $k$-th root instead, for any positive integer $k$; taking $k = 3$ then proves the claim for cube roots.
Following the proof structure showing the existence of square roots, let $\lambda_1, \dots, \lambda_m$ be the distinct eigenvalues of $T$. For each $j$ there exists a nilpotent operator $N_j \in \mathcal{L}\big(G(\lambda_j, T)\big)$ such that $T|_{G(\lambda_j, T)} = \lambda_j I + N_j$, via the description of operators on complex vector spaces. Because $T$ is invertible, $\lambda_j \neq 0$ for all $j$, so then:
$$T|_{G(\lambda_j, T)} = \lambda_j \left( I + \frac{N_j}{\lambda_j} \right)$$
for each $j$.
Cool. Now we need to prove an important lemma:
Lemma
Suppose $N \in \mathcal{L}(V)$ is nilpotent and $k$ is a positive integer. Then $I + N$ has a $k$-th root.
Proof
Say $N^m = 0$ (such an $m$ exists since $N$ is nilpotent). Consider the choice:
$$R = I + \frac{1}{k}N + \binom{1/k}{2}N^2 + \cdots + \binom{1/k}{m-1}N^{m-1},$$
where $\binom{1/k}{j}$ is the generalized binomial coefficient. The Taylor series of $(1+x)^{1/k}$ is the same sum as above, using $x$ in place of $N$ (continued to infinitely many terms, all of which vanish here from $N^m$ onward). If you believe the multiplication of the power series works out the same for $N$ as it does for $x$, then $R^k = I + N$.
☐
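To make the lemma concrete in the case we actually need, here is the $k = 3$ expansion with the generalized binomial coefficients written out (this expansion is standard, though not part of the original writeup):
$$R = I + \frac{1}{3}N - \frac{1}{9}N^2 + \frac{5}{81}N^3 - \cdots,$$
and cubing this sum term by term makes every coefficient beyond the $N^1$ term cancel, leaving $R^3 = I + N$.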
As such, the lemma implies that, since $N_j / \lambda_j$ is nilpotent, $I + N_j/\lambda_j$ has a cube root, say $R_j$. Choosing any complex cube root $\lambda_j^{1/3}$ of $\lambda_j$, we therefore obtain a cube root $\lambda_j^{1/3} R_j$ of $T|_{G(\lambda_j, T)}$.
A typical vector $v \in V$ can be written as:
$$v = u_1 + \cdots + u_m,$$
where each $u_j \in G(\lambda_j, T)$. Define $R \in \mathcal{L}(V)$ by:
$$Rv = \lambda_1^{1/3} R_1 u_1 + \cdots + \lambda_m^{1/3} R_m u_m.$$
Then $R$ is a cube root of $T$ because (each $G(\lambda_j, T)$ is invariant under $R_j$):
$$R^3 v = \lambda_1 R_1^3 u_1 + \cdots + \lambda_m R_m^3 u_m = T|_{G(\lambda_1, T)} u_1 + \cdots + T|_{G(\lambda_m, T)} u_m = Tv.$$
Thus $R$ is a valid cube root for $T$.
☐
8
Question
Suppose $T \in \mathcal{L}(V)$ and $3$ and $8$ are eigenvalues of $T$. Let $n = \dim V$. Prove that $V = \operatorname{null} T^{n-2} \oplus \operatorname{range} T^{n-2}$.
Proof
Let's show the intersection is trivial, so suppose $v \in \operatorname{null} T^{n-2} \cap \operatorname{range} T^{n-2}$. So then $T^{n-2}v = 0$ and $\exists u \in V$ s.t. $v = T^{n-2}u$. Apply $T^{n-2}$ to both sides to get:
$$0 = T^{n-2}v = T^{2(n-2)}u,$$
so $u \in \operatorname{null} T^{2(n-2)} = \operatorname{null} T^{n-2}$, because null spaces of powers have to stop growing eventually. Notice we have to make sure the stabilization has already happened by the $(n-2)$-th power, so that raising the power from $n-2$ to $2(n-2)$ adds nothing; but this is true: since $3$ and $8$ are eigenvalues of $T$, we have $\dim G(3, T) \geq 1$ and $\dim G(8, T) \geq 1$, so $\dim G(0, T) \leq n - 2$, hence $\operatorname{null} T^{n-2} = \operatorname{null} T^{m}$ for every $m \geq n-2$. Therefore $u \in \operatorname{null} T^{n-2}$, so $v = T^{n-2}u = 0$ as desired.
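To spell out the stabilization step just used (my own elaboration of the inequality): $G(0, T) = \operatorname{null} T^n$, and the restriction of $T$ to this subspace is nilpotent on a space of dimension at most $n-2$, so it is already killed by the $(n-2)$-th power. Combined with the fact that null spaces of powers form an increasing chain that never grows again once it stalls, this gives
$$\operatorname{null} T^{n-2} = \operatorname{null} T^{n-1} = \operatorname{null} T^{n} = \cdots = \operatorname{null} T^{2(n-2)}.$$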
So the intersection is trivial, making the sum a direct sum.
Also:
$$\dim\left(\operatorname{null} T^{n-2} \oplus \operatorname{range} T^{n-2}\right) = \dim \operatorname{null} T^{n-2} + \dim \operatorname{range} T^{n-2} = \dim V,$$
using a combination of the fact that a direct sum's dimensions add up, as well as the FTOLM (Fundamental Theorem of Linear Maps). A subspace of $V$ of full dimension is all of $V$, so $V = \operatorname{null} T^{n-2} \oplus \operatorname{range} T^{n-2}$.
☐
9
Question
Suppose $A$ and $B$ are block diagonal matrices of the form:
$$A = \begin{pmatrix} A_1 & & 0 \\ & \ddots & \\ 0 & & A_m \end{pmatrix}, \qquad B = \begin{pmatrix} B_1 & & 0 \\ & \ddots & \\ 0 & & B_m \end{pmatrix},$$
where $A_j$ and $B_j$ are square matrices of the same size for $j = 1, \dots, m$. Show that $AB$ is a block diagonal matrix of the form:
$$AB = \begin{pmatrix} A_1 B_1 & & 0 \\ & \ddots & \\ 0 & & A_m B_m \end{pmatrix}.$$
Proof
First consider the whole matrices. Since $A$ and $B$ are the same size, say $n \times n$, consider $V = \mathbf{F}^n$ as the vector space of column vectors of that size $n$. We can choose a basis $v_1, \dots, v_n$ of $V$ and then define $S, T \in \mathcal{L}(V)$ in a way such that $\mathcal{M}(S) = A$ and $\mathcal{M}(T) = B$. Therefore, $\mathcal{M}(ST) = AB$.
Now for the "sub-matrices". Let $n_1$ be the size of $A_1$, which is the same as the size of $B_1$. Consider the list of the first $n_1$ vectors from $v_1, \dots, v_n$, and note that the span of these vectors is invariant under both $S$ and $T$ (because $A$ and $B$ are block diagonal, the columns passing through the first block have non-zero entries only in the first $n_1$ rows, showing invariance).
Similarly, the span of the next $n_2$ vectors in the basis is also invariant. Continuing, we see that there are $m$ distinct lists of consecutive basis vectors with no intersections.
Let $U_1, \dots, U_m$ denote such spans, namely $V = U_1 \oplus \cdots \oplus U_m$. Clearly $\mathcal{M}(S|_{U_j}) = A_j$ and likewise $\mathcal{M}(T|_{U_j}) = B_j$ for each $j$, so $\mathcal{M}\big((ST)|_{U_j}\big) = A_j B_j$, as desired.
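A small numerical illustration of this result (my own addition; it uses SciPy's `block_diag` helper, and the block sizes here are arbitrary):

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)
A1, A2 = rng.normal(size=(2, 2)), rng.normal(size=(3, 3))
B1, B2 = rng.normal(size=(2, 2)), rng.normal(size=(3, 3))

A = block_diag(A1, A2)  # 5x5 block diagonal matrix
B = block_diag(B1, B2)

# The product is block diagonal with blocks A_j B_j:
assert np.allclose(A @ B, block_diag(A1 @ B1, A2 @ B2))
```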
☐
10
Question
Suppose $\mathbf{F} = \mathbf{C}$ and $T \in \mathcal{L}(V)$. Prove that there exist $D, N \in \mathcal{L}(V)$ such that $T = D + N$, where $D$ is diagonalizable, $N$ is nilpotent, and $DN = ND$.
Proof
Let $\lambda_1, \dots, \lambda_m$ be the distinct eigenvalues of $T$. Using the description of operators on complex vector spaces, there exists a nilpotent operator $N_j \in \mathcal{L}\big(G(\lambda_j, T)\big)$ such that $T|_{G(\lambda_j, T)} = \lambda_j I + N_j$ for each $j$. Notice though that $\lambda_j I$ is a diagonalizable operator on $G(\lambda_j, T)$: since $G(\lambda_j, T) = \operatorname{null}(T - \lambda_j I)^{\dim V}$ is a subspace, there's a basis $v_1, \dots, v_{d_j}$ spanning $G(\lambda_j, T)$. Since all the $v_i$'s are generalized eigenvectors, we can create the 1-dimensional subspaces $U_i = \operatorname{span}(v_i)$ for each $i$. Notice that these are clearly invariant under $\lambda_j I$, so by the conditions equivalent to diagonalizability, $\lambda_j I$ is diagonalizable. This makes even more sense since:
$$\mathcal{M}(\lambda_j I) = \begin{pmatrix} \lambda_j & & 0 \\ & \ddots & \\ 0 & & \lambda_j \end{pmatrix}.$$
As such, we've done the construction we want on each generalized eigenspace. Now we show that $\lambda_j I$ and $N_j$ commute for all $j$. But that's easy since:
$$(\lambda_j I) N_j = \lambda_j N_j = N_j (\lambda_j I),$$
as desired.
Now we make $D$ and $N$. Every $v \in V$ can be written as a sum of vectors from each generalized eigenspace:
$$v = u_1 + \cdots + u_m,$$
where each $u_j \in G(\lambda_j, T)$. Create the operator $D \in \mathcal{L}(V)$ defined by:
$$Dv = \lambda_1 u_1 + \cdots + \lambda_m u_m.$$
Each $G(\lambda_j, T)$ has its own basis $v_1^{(j)}, \dots, v_{d_j}^{(j)}$. Notice that:
$$D v_i^{(j)} = \lambda_j v_i^{(j)}.$$
So each $v_i^{(j)}$ is an eigenvector of $D$ with eigenvalue $\lambda_j$, so as a result each restriction $D|_{G(\lambda_j, T)}$ is diagonalizable, where:
$$\mathcal{M}\big(D|_{G(\lambda_j, T)}\big) = \operatorname{diag}(\lambda_j, \dots, \lambda_j).$$
So if we use the basis of $V$ obtained by concatenating these bases (in the same order), then:
$$\mathcal{M}(D) = \operatorname{diag}(\underbrace{\lambda_1, \dots, \lambda_1}_{d_1}, \dots, \underbrace{\lambda_m, \dots, \lambda_m}_{d_m}),$$
which is a diagonal matrix.
Similarly we define $N \in \mathcal{L}(V)$ by:
$$Nv = N_1 u_1 + \cdots + N_m u_m.$$
Clearly $N$ is nilpotent since each $N_j$ is nilpotent: just take $k$ to be the max of the powers needed for each $N_j$, so that as a result:
$$N^k v = N_1^k u_1 + \cdots + N_m^k u_m = 0.$$
Now we just show $DN = ND$, but that's easy since each $N_j u_j$ stays inside $G(\lambda_j, T)$:
$$DNv = \lambda_1 N_1 u_1 + \cdots + \lambda_m N_m u_m = N(\lambda_1 u_1 + \cdots + \lambda_m u_m) = NDv.$$
Thus $T = D + N$, since:
$$Tv = (\lambda_1 I + N_1)u_1 + \cdots + (\lambda_m I + N_m)u_m = Dv + Nv.$$
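As a tiny concrete instance of this decomposition (my own example, not part of the original solution): on $\mathbf{C}^2$,
$$\underbrace{\begin{pmatrix} 5 & 1 \\ 0 & 5 \end{pmatrix}}_{T} = \underbrace{\begin{pmatrix} 5 & 0 \\ 0 & 5 \end{pmatrix}}_{D} + \underbrace{\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}}_{N},$$
where $D$ is diagonal (hence diagonalizable), $N^2 = 0$, and $DN = 5N = ND$.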
☐
11
Question
Suppose $\mathbf{F} = \mathbf{C}$ and $T \in \mathcal{L}(V)$. Prove that for every basis of $V$ with respect to which $T$ has an upper triangular matrix, the number of times that $\lambda$ appears on the diagonal of the matrix of $T$ equals the multiplicity of $\lambda$ as an eigenvalue of $T$.
Proof
Let $v_1, \dots, v_n$ be a basis of $V$ where $\mathcal{M}(T)$ is an upper triangular matrix. As such, we know that the eigenvalues of $T$ have to appear on the diagonal of $\mathcal{M}(T)$.
Let $\lambda$ be an arbitrary eigenvalue, and say it appears in the $k$-th diagonal entry of $\mathcal{M}(T)$. Then:
$$Tv_k = a_1 v_1 + \cdots + a_{k-1} v_{k-1} + \lambda v_k$$
for some $a_1, \dots, a_{k-1} \in \mathbf{C}$. Thus:
$$(T - \lambda I)v_k \in \operatorname{span}(v_1, \dots, v_{k-1}).$$
Suppose we look at the positions $k_1 < \cdots < k_d$ where $\lambda$ appears on the diagonal, starting from the last one, and say that $d$ is the number of times it appears. We want to show that $\dim G(\lambda, T) = d$ as a result. Note that, in a similar vein of logic:
$$(T - \lambda I)v_{k_i} \in \operatorname{span}(v_1, \dots, v_{k_i - 1})$$
for each $i$. We can repeat this process over and over again, with each application of $T - \lambda I$ landing in the span of earlier basis vectors, and show a few important things.
Are these spans strict subsets of one another? We just have to show the relevant vectors are linearly independent, which they are because they come from (a subset of) the basis $v_1, \dots, v_n$. Consequently, we can treat each span as a vector space in its own right, and since the containments are strict, the dimensions have to keep dropping as we apply $T - \lambda I$. Therefore, after at most $n$ applications we run out of room, so from each position $k_i$ we can manufacture a vector (a suitable combination of $v_{k_i}$ with earlier basis vectors) that $(T - \lambda I)^n$ kills. But vectors annihilated by $(T - \lambda I)^n$ are exactly the generalized eigenvectors for $\lambda$, and the $d$ vectors built this way are linearly independent (each has a $v_{k_i}$ component the others lack), so $\dim G(\lambda, T) \geq d_\lambda$, writing $d_\lambda$ for the number of times $\lambda$ appears on the diagonal.
Now count: $\sum_\lambda d_\lambda = n$ because we have $n$ diagonal entries in total, while $\sum_\lambda \dim G(\lambda, T) = \dim V = n$ because $V$ is the direct sum of the generalized eigenspaces. Each term of the first sum is at most the corresponding term of the second, and both sums equal $n$.
Thus $\dim G(\lambda, T) = d_\lambda$ for all $\lambda$, completing the proof.
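As a quick sanity check of the statement on a concrete matrix (my own example): take
$$\mathcal{M}(T) = \begin{pmatrix} 2 & 1 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 5 \end{pmatrix}.$$
Here $2$ appears twice on the diagonal, and indeed $\dim G(2, T) = 2$ ($e_1$ is an eigenvector and $e_2$ a generalized eigenvector, since $(T - 2I)^2 e_2 = 0$), while $5$ appears once and $\dim G(5, T) = 1$.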
☐
8.C: Characteristic & Minimal Polynomials
1
Question
Suppose $T \in \mathcal{L}(\mathbf{C}^4)$ is such that the eigenvalues of $T$ are $3$, $5$, and $8$. Prove that $(T - 3I)^2 (T - 5I)^2 (T - 8I)^2 = 0$.
Proof
Notice that $\dim \mathbf{C}^4 = 4$ in this case. Since we have 3 distinct eigenvalues whose generalized eigenspaces direct-sum to $\mathbf{C}^4$, exactly one generalized eigenspace is of dimension 2 while the others are of dimension 1. For instance, if $\dim G(3, T) = 2$ then $\dim G(5, T) = \dim G(8, T) = 1$. As such, the characteristic polynomial for $T$ would be given by:
$$q(z) = (z-3)^{d_1}(z-5)^{d_2}(z-8)^{d_3},$$
where exactly one $d_j$ equals 2 while the others are equal to 1; in particular every $d_j \leq 2$. Notice then that $q(T) = 0$ by the Cayley-Hamilton Theorem. Furthermore, we can increase the degree of each factor up to 2 such that we have:
$$(z-3)^2(z-5)^2(z-8)^2 = q(z)\, r(z),$$
where $r(z)$ contains the other factors (which we won't care about here, without loss of generality). If we show that this product applied to $T$ vanishes, then we are done:
$$(T-3I)^2(T-5I)^2(T-8I)^2 = r(T)\, q(T) = r(T) \cdot 0 = 0.$$
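As a numerical spot check (my own addition, not part of the proof): any upper triangular matrix with diagonal entries $3, 5, 8$, one of them repeated, realizes such a $T$, and the product indeed vanishes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Upper triangular with eigenvalues 3, 5, 8 (here 8 has multiplicity 2).
T = np.triu(rng.normal(size=(4, 4)), k=1) + np.diag([3.0, 5.0, 8.0, 8.0])
I = np.eye(4)

prod = (np.linalg.matrix_power(T - 3*I, 2)
        @ np.linalg.matrix_power(T - 5*I, 2)
        @ np.linalg.matrix_power(T - 8*I, 2))
assert np.allclose(prod, 0)
```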
☐
2
Question
Suppose $V$ is a complex vector space. Suppose $T \in \mathcal{L}(V)$ is such that $5$ and $6$ are eigenvalues of $T$ and $T$ has no other eigenvalues. Prove that $(T - 5I)^{n-1}(T - 6I)^{n-1} = 0$, where $n = \dim V$.
Proof
This takes a similar process to 8.C.1 done earlier. Namely, we know that since $5$ and $6$ are the only eigenvalues of $T$, then:
$$V = G(5, T) \oplus G(6, T).$$
Thus the characteristic polynomial of $T$ would be:
$$q(z) = (z-5)^{d_1}(z-6)^{d_2},$$
where we require that $d_1 + d_2 = n$, since $d_1 = \dim G(5, T)$ and $d_2 = \dim G(6, T)$ must add up to $\dim V = n$ as a result.
First note that we must have each $d_j \geq 1$. Why? Because if $d_j = 0$ then $G(\lambda_j, T) = \{0\}$, so $\lambda_j$ is not an eigenvalue, which is a contradiction to the given. As such, at minimum $d_1 = 1$ forces $d_2 = n-1$ at maximum, and vice versa (without loss of generality). Clearly then $d_1, d_2 \leq n-1$ for similar reasoning. Therefore, consider the polynomial $s$ defined by:
$$s(z) = (z-5)^{n-1}(z-6)^{n-1}.$$
Clearly, since $d_1, d_2 \leq n-1$, there is some polynomial $r$ where:
$$s(z) = q(z)\, r(z).$$
Now if we show that $s(T) = 0$ then we are done. Notice that $q(T) = 0$ by the Cayley-Hamilton Theorem. Thus:
$$(T - 5I)^{n-1}(T - 6I)^{n-1} = r(T)\, q(T) = 0.$$
☐
3
Question
Give an example of an operator on $\mathbf{C}^4$ whose characteristic polynomial equals $(z-7)^2(z-8)^2$.
Proof
Consider $T \in \mathcal{L}(\mathbf{C}^4)$ given by:
$$T(z_1, z_2, z_3, z_4) = (7z_1, 7z_2, 8z_3, 8z_4).$$
Notice that clearly $\lambda_1 = 7$ with multiplicity $d_1 = 2$, since $G(7, T) = \operatorname{span}(e_1, e_2)$ (using the standard basis $e_1, \dots, e_4$). Similarly $\lambda_2 = 8$ with $d_2 = 2$, since $G(8, T) = \operatorname{span}(e_3, e_4)$. Thus the characteristic polynomial for $T$ is:
$$q(z) = (z-7)^2(z-8)^2.$$
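A quick check with NumPy (my addition): `np.poly` returns the coefficients of the characteristic polynomial of a matrix, highest degree first.

```python
import numpy as np

M = np.diag([7.0, 7.0, 8.0, 8.0])  # matrix of T in the standard basis

# (z-7)^2 (z-8)^2, expanded via repeated polynomial multiplication:
expected = np.polymul(np.polymul([1, -7], [1, -7]),
                      np.polymul([1, -8], [1, -8]))
assert np.allclose(np.poly(M), expected)
```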
☐
4
Question
Give an example of an operator on $\mathbf{C}^4$ whose characteristic polynomial equals $(z-1)(z-5)^3$ and whose minimal polynomial equals $(z-1)(z-5)^2$.
Proof
Consider $T \in \mathcal{L}(\mathbf{C}^4)$ defined by its matrix representation:
$$\mathcal{M}(T) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 5 & 1 & 0 \\ 0 & 0 & 5 & 0 \\ 0 & 0 & 0 & 5 \end{pmatrix}.$$
Notice that, using the standard basis $e_1, \dots, e_4$, $e_1$ is an eigenvector with $\lambda = 1$ and $e_2, e_4$ are eigenvectors with $\lambda = 5$, as given by the diagonal of the matrix ($e_3$ is a generalized eigenvector for $\lambda = 5$). As such, the characteristic polynomial is:
$$q(z) = (z-1)(z-5)^3.$$
We'll show that the minimal polynomial is $(z-1)(z-5)^2$ by showing that $(T-I)(T-5I)^2 = 0$ while $(T-I)(T-5I) \neq 0$:
$$(T-5I)^2 e_j = 0 \text{ for } j = 2, 3, 4, \qquad (T-I)(T-5I)^2 e_1 = 16\,(T-I)e_1 = 0.$$
Thus $(T-I)(T-5I)^2 = 0$. Further:
$$(T-I)(T-5I)e_3 = (T-I)e_2 = 4e_2 \neq 0.$$
Therefore the minimal polynomial is $(z-1)(z-5)^2$ (we cannot drop to $(z-1)(z-5)$, and both eigenvalues must remain as roots).
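Checking both annihilation claims numerically (my addition, using the matrix above):

```python
import numpy as np

M = np.array([
    [1, 0, 0, 0],
    [0, 5, 1, 0],
    [0, 0, 5, 0],
    [0, 0, 0, 5],
], dtype=float)
I = np.eye(4)

assert np.allclose((M - I) @ (M - 5*I) @ (M - 5*I), 0)  # (z-1)(z-5)^2 annihilates
assert not np.allclose((M - I) @ (M - 5*I), 0)          # (z-1)(z-5) does not
```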
☐
5
Question
Give an example of an operator on $\mathbf{C}^4$ whose characteristic and minimal polynomials both equal $z(z-1)^2(z-3)$.
Proof
Choose $T \in \mathcal{L}(\mathbf{C}^4)$ where:
$$\mathcal{M}(T) = \begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 3 \end{pmatrix}.$$
Notice that the diagonal tells us the $\lambda$'s (since it's UT), so the characteristic polynomial comes from these $\lambda$'s:
$$q(z) = z(z-1)^2(z-3),$$
where notice we have a multiplicity of 2 on the $(z-1)$ factor since $e_2, e_3$ from the standard basis are (generalized) eigenvectors for $\lambda = 1$.
To show that the minimal polynomial is the characteristic polynomial, we show that $T(T-I)(T-3I) \neq 0$:
$$T(T-I)(T-3I)e_3 = T(T-I)(e_2 - 2e_3) = T(-2e_2) = -2e_2 \neq 0.$$
So since the only way to shrink the characteristic polynomial (while keeping every eigenvalue as a root) is to reduce the power of 2 on the $(z-1)$ factor, and we just saw that fails, we have no other options. The characteristic polynomial must be the minimal polynomial in this case.
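Verifying both claims numerically (my addition, using the matrix above):

```python
import numpy as np

M = np.array([
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 3],
], dtype=float)
I = np.eye(4)

assert np.allclose(M @ (M - I) @ (M - I) @ (M - 3*I), 0)  # z(z-1)^2(z-3) annihilates
assert not np.allclose(M @ (M - I) @ (M - 3*I), 0)        # z(z-1)(z-3) does not
```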
☐
6
Question
Give an example of an operator on $\mathbf{C}^4$ whose characteristic polynomial equals $z(z-1)^2(z-3)$ while the minimal polynomial equals $z(z-1)(z-3)$.
Proof
Do a similar process to the last problem.
Choose $T \in \mathcal{L}(\mathbf{C}^4)$ where:
$$\mathcal{M}(T) = \begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 3 \end{pmatrix}.$$
Notice that the diagonal tells us the $\lambda$'s (since it's UT), so the characteristic polynomial comes from these $\lambda$'s:
$$q(z) = z(z-1)^2(z-3),$$
where notice we have a multiplicity of 2 on the $(z-1)$ factor since $e_2, e_3$ from the standard basis are eigenvectors for $\lambda = 1$.
To show that the minimal polynomial is indeed $z(z-1)(z-3)$, we show that $T(T-I)(T-3I) = 0$: since $\mathcal{M}(T)$ is diagonal, the product is diagonal with entries $\mu(\mu-1)(\mu-3)$ for $\mu \in \{0, 1, 1, 3\}$, all of which vanish, so:
$$T(T-I)(T-3I) = 0.$$
Thus the minimal polynomial in this case is $z(z-1)(z-3)$ (notice we cannot reduce any more factors, since every eigenvalue must be a root of the minimal polynomial, so this has to be the minimal polynomial).
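And the corresponding numerical check (my addition):

```python
import numpy as np

M = np.diag([0.0, 1.0, 1.0, 3.0])
I = np.eye(4)

# z(z-1)(z-3) already annihilates this diagonal matrix:
assert np.allclose(M @ (M - I) @ (M - 3*I), 0)
```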
☐
7
Question
Suppose $V$ is a complex vector space. Suppose $P \in \mathcal{L}(V)$ is such that $P^2 = P$. Prove that the characteristic polynomial of $P$ is $z^m(z-1)^n$, where $m = \dim \operatorname{null} P$ and $n = \dim \operatorname{range} P$.
Proof
Since $P^2 = P$ then:
$$P^j = P \quad \text{for every integer } j \geq 1.$$
But notice an important thing about why this holds. First, take any $j \geq 2$; then $j$ is either even or odd. If it's even then:
$$P^j = (P^2)^{j/2} = P^{j/2}.$$
If instead it's odd then:
$$P^j = P \cdot (P^2)^{(j-1)/2} = P \cdot P^{(j-1)/2} = P^{(j+1)/2}.$$
In both cases, we can repeatedly apply this (just treat the new integer power as $j$) and keep reducing it down further and further. It must stop once we hit the lowest possible power, which would be $P^1$, so $P^j = P$ as a result.
Therefore:
$$G(0, P) = \operatorname{null} P^{\dim V} = \operatorname{null} P.$$
This shows that the multiplicity of the eigenvalue $0$ is $m = \dim \operatorname{null} P$, as we wanted in the theorem. (Note also that every eigenvalue $\lambda$ of $P$ satisfies $\lambda^2 = \lambda$, by applying $P^2 = P$ to an eigenvector, so $0$ and $1$ are the only possible eigenvalues.)
But also the FTOLM implies that:
$$\dim V = \dim \operatorname{null} P + \dim \operatorname{range} P = m + n.$$
Implying that the multiplicity of the eigenvalue $1$ is $\dim V - m = n$, since the multiplicities over all eigenvalues sum to $\dim V$ on a complex vector space. Notice that we now have the right information to build our characteristic polynomial: the multiplicity of $0$ is $m$, so there's going to be a factor of $z^m$ in the characteristic polynomial. Further, we found that the multiplicity of $1$ is $n$, so we expect a factor of $(z-1)^n$ as well. Putting them together, since we have no other eigenvalues and thus no other possible factors:
$$q(z) = z^m(z-1)^n,$$
as desired.
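Alternatively, one can compute the multiplicity of $1$ directly rather than by subtraction (this check is my own addition). Since $(I - P)^2 = I - 2P + P^2 = I - P$, the operator $I - P$ is also idempotent, so the same argument gives
$$G(1, P) = \operatorname{null}(P - I)^{\dim V} = \operatorname{null}(P - I) = \operatorname{range} P,$$
where the last equality holds because $Pv = v$ exactly when $v \in \operatorname{range} P$ (if $v = Pu$ then $Pv = P^2 u = Pu = v$).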
☐
8
Question
Suppose $T \in \mathcal{L}(V)$. Prove that $T$ is invertible iff the constant term in the minimal polynomial of $T$ is non-zero.
Proof
Let's prove the negated version of this: that $T$ is not invertible iff the constant term in the minimal polynomial $p$ of $T$ is zero.
The constant term of $p$ is zero (i.e. $p(0) = 0$) iff $z$ is a factor of $p$ (we can show this via contrapositive: if $z$ weren't a factor, then $p(0)$ would equal the constant term, which would have to be non-zero), iff we can factor out the $z$ as a multiple:
$$p(z) = z\, q(z),$$
where $q$ is the same polynomial as $p$ with the factor of $z$ removed. This holds iff $0$ is an eigenvalue of $T$ (the roots of the minimal polynomial are exactly the eigenvalues of $T$), iff $T$ is not invertible.
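To see the non-invertibility concretely in the forward direction (an elaboration of mine, not in the original): if $p(z) = z\,q(z)$, then $q$ has degree $\deg p - 1$, so $q(T) \neq 0$ by minimality of $p$. Picking $w \in V$ with $q(T)w \neq 0$:
$$T\big(q(T)w\big) = p(T)w = 0,$$
so the non-zero vector $q(T)w$ lies in $\operatorname{null} T$, and $T$ is not invertible.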
☐
9
Question
Suppose $T \in \mathcal{L}(V)$ has minimal polynomial $4 + 5z - 6z^2 - 7z^3 + 2z^4 + z^5$. Find the minimal polynomial of $T^{-1}$.
Proof
We are given the minimal polynomial $p(z) = 4 + 5z - 6z^2 - 7z^3 + 2z^4 + z^5$ shown above. Since $p(T) = 0$ then:
$$4I + 5T - 6T^2 - 7T^3 + 2T^4 + T^5 = 0.$$
Since $T$ is invertible (its minimal polynomial has constant term $4 \neq 0$, so the previous exercise applies), apply $T^{-1}$ a total of 5 times to both sides:
$$4(T^{-1})^5 + 5(T^{-1})^4 - 6(T^{-1})^3 - 7(T^{-1})^2 + 2T^{-1} + I = 0.$$
So, since having $q(T^{-1}) = 0$ implies that $q$ is a polynomial multiple of the minimal polynomial of $T^{-1}$, the left side gives us such a multiple. Namely:
$$q(z) = 4z^5 + 5z^4 - 6z^3 - 7z^2 + 2z + 1.$$
Now all we have to do to get the actual minimal polynomial is to make this polynomial monic. Notice that we can't add factors (otherwise it's not a multiple of the minimal polynomial) nor remove factors (the minimal polynomial of $T^{-1}$ must have degree 5, since a lower-degree relation in $T^{-1}$ would reverse into a polynomial of degree less than 5 annihilating $T$, contradicting minimality of $p$), so all we can do is multiply by a non-zero constant. As a result, to make it monic, divide by the coefficient 4 of the highest term:
$$p_{T^{-1}}(z) = z^5 + \frac{5}{4}z^4 - \frac{3}{2}z^3 - \frac{7}{4}z^2 + \frac{1}{2}z + \frac{1}{4}.$$
☐
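Finally, an end-to-end numerical check of this answer (my own addition). The companion matrix of $p$ is a concrete operator whose minimal polynomial is exactly $p$, so it can stand in for a worst-case $T$:

```python
import numpy as np

# p(z) = 4 + 5z - 6z^2 - 7z^3 + 2z^4 + z^5, coefficients lowest degree first.
p = np.array([4.0, 5.0, -6.0, -7.0, 2.0, 1.0])

# Companion matrix C of p: its characteristic and minimal polynomials equal p.
C = np.zeros((5, 5))
C[1:, :4] = np.eye(4)   # ones on the subdiagonal
C[:, 4] = -p[:5]        # last column: negated lower-order coefficients

Cinv = np.linalg.inv(C)

# Evaluate q(z) = z^5 + (5/4)z^4 - (3/2)z^3 - (7/4)z^2 + (1/2)z + 1/4 at C^{-1}.
q = [1/4, 1/2, -7/4, -3/2, 5/4, 1]  # lowest degree first
result = np.zeros((5, 5))
power = np.eye(5)
for coeff in q:
    result += coeff * power
    power = power @ Cinv

assert np.allclose(result, 0)  # q annihilates T^{-1}
```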