This is a false statement. Notice here that if we have $p(z) = z^m$ and $q(z) = 1 - z^m$ then both $p, q \in U$, but $(p + q)(z) = 1 \notin U$ since its degree is $0 \neq m$. Hence, $U$ isn't closed under vector addition.
3
Theorem
The subset $U = \{0\} \cup \{p \in \mathcal{P}(\mathbf{F}) : \deg p \text{ is even}\}$ is a subspace of $\mathcal{P}(\mathbf{F})$.
This is also a false statement. Let $p(z) = z^2 + z$ and $q(z) = -z^2$. Then $(p + q)(z) = z$, which has odd degree, so $p + q \notin U$, showing that $U$ isn't closed under vector addition.
4
Theorem
Suppose $m$ and $n$ are positive integers where $m \leq n$, and suppose $\lambda_1, \dots, \lambda_m \in \mathbf{F}$. Then there is a polynomial $p \in \mathcal{P}(\mathbf{F})$ with $\deg p = n$ such that $0 = p(\lambda_1) = \cdots = p(\lambda_m)$ and such that $p$ has no other zeros.
Proof
Consider if $m = n$. Then clearly there's a polynomial where:
$$p(z) = (z - \lambda_1)(z - \lambda_2)\cdots(z - \lambda_m)$$
coming from desiring $p(\lambda_1) = \cdots = p(\lambda_m) = 0$. This step comes from the factorization of a polynomial by its zeros. Here $\deg p = m = n$ as desired, and clearly $p$ doesn't have any other zeros.
Now consider if $m < n$. Then $n - m > 0$. We need to add $n - m$ to the degree without adding "new" zeros. Hence use:
$$p(z) = (z - \lambda_1)^{n - m + 1}(z - \lambda_2)\cdots(z - \lambda_m)$$
Hence we have found a polynomial where $\deg p = (n - m + 1) + (m - 1) = n$ and whose zeros are exactly $\lambda_1, \dots, \lambda_m$. As required.
☐
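As a quick sanity check of the construction (not part of the proof), here's a small sympy sketch; the values of $n$ and the $\lambda_j$'s are made up for illustration:

```python
import sympy as sp

z = sp.symbols('z')

# Illustrative values (not from the exercise): m = 3 zeros, target degree n = 5.
lams = [1, 2, 3]                 # lambda_1, ..., lambda_m
n, m = 5, len(lams)

# p(z) = (z - lambda_1)^(n - m + 1) * (z - lambda_2) * ... * (z - lambda_m)
p = (z - lams[0])**(n - m + 1)
for lam in lams[1:]:
    p *= (z - lam)
p = sp.expand(p)

print(sp.degree(p, z))           # 5, i.e. deg p = n as required
print(sp.roots(p, z))            # {1: 3, 2: 1, 3: 1}: only the prescribed zeros
```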
6
Theorem
Suppose $p \in \mathcal{P}(\mathbf{C})$ has degree $m$. Then $p$ has $m$ distinct zeros iff $p$ and its derivative $p'$ have no zeros in common.
Proof
Consider $(\Rightarrow)$, so suppose $p$ has $m$ distinct zeros $\lambda_1, \dots, \lambda_m$. Then:
$$p(z) = c(z - \lambda_1)(z - \lambda_2)\cdots(z - \lambda_m)$$
Grouping the factorization (or using the division algorithm), we can show for any zero $\lambda_j$ above that:
$$p(z) = (z - \lambda_j)q(z), \quad \text{where } q(z) = c\prod_{k \neq j}(z - \lambda_k)$$
and using the derivative on both sides:
$$p'(z) = q(z) + (z - \lambda_j)q'(z)$$
Thus, notice that $\lambda_j$ cannot be shared between $p$ and $p'$: evaluating the above gives $p'(\lambda_j) = q(\lambda_j) = c\prod_{k \neq j}(\lambda_j - \lambda_k)$, which is nonzero since the $\lambda_k$ are distinct and $c \neq 0$. Therefore, $p$ and $p'$ have no zeros in common.
Now $(\Leftarrow)$. Suppose that $p$ and $p'$ have no zeros in common. Assume for contradiction that $p$ doesn't have $m$ distinct zeros. Since $p$ is a degree-$m$ polynomial over $\mathbf{C}$, it has exactly $m$ zeros counting multiplicity, so at least one zero $\lambda$ must be repeated. Namely:
$$p(z) = (z - \lambda)^2 q(z)$$
where since $\lambda$ is repeated, it must occur at least twice. Differentiating:
$$p'(z) = 2(z - \lambda)q(z) + (z - \lambda)^2 q'(z)$$
So then $p'(\lambda) = 0$, so clearly both $p$ and $p'$ share the same zero $\lambda$, which contradicts our supposition from the beginning. Hence, our assumption was wrong, so $p$ must have $m$ distinct zeros.
☐
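As a quick sympy sanity check of the equivalence (example polynomials chosen here for illustration), shared zeros of $p$ and $p'$ show up in $\gcd(p, p')$:

```python
import sympy as sp

z = sp.symbols('z')

# One polynomial with a repeated zero, one with all-distinct zeros.
p_repeated = sp.expand((z - 1)**2 * (z - 2))
p_distinct = sp.expand((z - 1) * (z - 2) * (z - 3))

for p in (p_repeated, p_distinct):
    common = sp.gcd(p, sp.diff(p, z))   # nontrivial iff p and p' share a zero
    print(common)                       # z - 1 for the repeated case, 1 otherwise
```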
7
Theorem
Every polynomial of odd degree with real coefficients has a real zero.
Proof
Let $p$ be an arbitrary odd-degree real-coefficient polynomial. Assume for contradiction that $p$ has only non-real zeros, i.e.:
$$p(z) = c(z - \lambda_1)(z - \lambda_2)\cdots(z - \lambda_m), \quad \lambda_j \in \mathbf{C} \setminus \mathbf{R}$$
where $m = \deg p$ is assumed odd. Since any polynomial with real coefficients has its non-real zeros in conjugate pairs (of equal multiplicity), we can, without loss of generality, reorder so that:
$$p(z) = c(z - \lambda_1)(z - \overline{\lambda_1})(z - \lambda_2)(z - \overline{\lambda_2})\cdots$$
But look! Since we have an odd number of $\lambda_j$'s, pairing them up this way leaves one last $\lambda_j$ that has no conjugate pair, which contradicts all non-real zeros having their conjugate pair as a zero. Hence, $p$ must contain a zero equal to its own conjugate, namely a real one.
☐
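A quick numerical sanity check with numpy (random degree-5 polynomials chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random real polynomials of odd degree 5; each should have >= 1 real root.
for _ in range(5):
    coeffs = rng.standard_normal(6)      # 6 coefficients -> degree 5
    roots = np.roots(coeffs)
    real_roots = roots[np.abs(roots.imag) < 1e-9].real
    assert real_roots.size >= 1          # odd degree forces a real zero
    print(real_roots)
```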
Chapter 5.A: Invariants (Finish)
1
Suppose $T \in \mathcal{L}(V)$ and $U$ is a subspace of $V$.
a
Theorem
If $U \subseteq \operatorname{null} T$, then $U$ is invariant under $T$.
Proof
Suppose $U \subseteq \operatorname{null} T$. Let $u \in U$ be arbitrary. We'll need to show that $Tu \in U$ as well.
Notice that since $u \in U$, and $U$ is a subset of the nullspace of $T$, then $Tu = 0$, so then $Tu = 0 \in U$ (as $U$ is a subspace). Hence, $U$ is invariant under $T$.
☐
b
Theorem
If $\operatorname{range} T \subseteq U$, then $U$ is invariant under $T$.
Proof
Let $u \in U$ be arbitrary. We'll need to show that $Tu \in U$. Note that since $Tu \in \operatorname{range} T \subseteq U$, then $Tu \in U$ as required.
☐
2
Theorem
Suppose $S, T \in \mathcal{L}(V)$ are such that $ST = TS$. Then $\operatorname{null} S$ is invariant under $T$.
Proof
Let $v \in \operatorname{null} S$ be arbitrary. We'll need to show that $Tv \in \operatorname{null} S$, meaning that $S(Tv) = 0$. Since $ST = TS$ then $S(Tv) = T(Sv)$. That means that:
$$S(Tv) = T(Sv) = T(0) = 0$$
since $Sv = 0$. Hence, $Tv \in \operatorname{null} S$, so then $\operatorname{null} S$ is invariant under $T$.
☐
3
Theorem
Suppose $S, T \in \mathcal{L}(V)$ are such that $ST = TS$. Then $\operatorname{range} S$ is invariant under $T$.
Proof
Let $v \in \operatorname{range} S$ be arbitrary. We'll need to show that $Tv \in \operatorname{range} S$. Since $v \in \operatorname{range} S$ then there is some vector $u \in V$ such that $v = Su$. Then notice that:
$$Tv = T(Su) = S(Tu)$$
so then since $Tu$ is just some vector then $S(Tu) \in \operatorname{range} S$, so then clearly $Tv \in \operatorname{range} S$. Therefore, $\operatorname{range} S$ is invariant under $T$.
☐
4
Theorem
Suppose that $V_1, \dots, V_m$ are subspaces of $V$ invariant under $T \in \mathcal{L}(V)$. Then $V_1 + \cdots + V_m$ is invariant under $T$.
Proof
Let $v \in V_1 + \cdots + V_m$ be arbitrary. We'll need to show that $Tv \in V_1 + \cdots + V_m$.
Since $v \in V_1 + \cdots + V_m$ then:
$$v = v_1 + v_2 + \cdots + v_m$$
where all $v_j \in V_j$ for $j = 1, \dots, m$. Since $T$ is a linear transformation then:
$$Tv = Tv_1 + Tv_2 + \cdots + Tv_m$$
Notice that $Tv_1 \in V_1$ since $V_1$ is an invariant subspace of $V$ under $T$. In general, all $Tv_j \in V_j$ for similar reasoning, so then clearly:
$$Tv = Tv_1 + \cdots + Tv_m \in V_1 + \cdots + V_m$$
so then $V_1 + \cdots + V_m$ is invariant under $T$.
☐
5
Theorem
Suppose $T \in \mathcal{L}(V)$. Then the intersection of every collection of subspaces of $V$ invariant under $T$ is invariant under $T$.
Proof
Let $\{V_i\}_{i \in I}$ be a collection of subspaces of $V$, each invariant under $T$, and let $v \in \bigcap_{i \in I} V_i$ be arbitrary. We want to show that $Tv \in \bigcap_{i \in I} V_i$ consequently.
Since $v \in \bigcap_{i \in I} V_i$ then for all $i \in I$ we have that $v \in V_i$. Since all $V_i$ are invariant under $T$, then $Tv \in V_i$ for all $i \in I$. Therefore, we have it that $Tv \in \bigcap_{i \in I} V_i$, showing the intersection is invariant under $T$.
☐
8
Define $T \in \mathcal{L}(\mathbf{F}^2)$ by:
$$T(w, z) = (z, w)$$
The eigenvalues and eigenvectors are found as follows. Let $\lambda$ be an eigenvalue for $T$. Then:
$$T(w, z) = (z, w) = \lambda(w, z)$$
thus then:
$$z = \lambda w \quad \text{and} \quad w = \lambda z$$
so $w = \lambda^2 w$, and since $w \neq 0$ (notice if $w = 0$ then that makes $z = \lambda w = 0$ too, giving the zero vector) then $\lambda^2 = 1$, so then the eigenvalues are $\lambda = \pm 1$.
Plugging back in to get our eigenvectors, solve $z = \lambda w$ for both cases:
$$\lambda = 1: \quad z = w$$
so we have the eigenvectors $(w, w)$, $w \neq 0$, associated with $\lambda = 1$, and then:
$$\lambda = -1: \quad z = -w$$
so we have the eigenvectors $(w, -w)$, $w \neq 0$, associated with $\lambda = -1$.
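A quick numpy check of this computation (numpy normalizes the eigenvector columns):

```python
import numpy as np

# Matrix of T(w, z) = (z, w) in the standard basis.
T = np.array([[0.0, 1.0],
              [1.0, 0.0]])

vals, vecs = np.linalg.eig(T)
print(vals)    # [ 1. -1.]
print(vecs)    # columns proportional to (1, 1) and (1, -1)
```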
9
Define $T \in \mathcal{L}(\mathbf{F}^3)$ by:
$$T(z_1, z_2, z_3) = (2z_2, 0, 5z_3)$$
then we try to find the $\lambda$'s for this transformation:
$$(2z_2, 0, 5z_3) = \lambda(z_1, z_2, z_3)$$
equate coordinates:
$$2z_2 = \lambda z_1, \qquad 0 = \lambda z_2, \qquad 5z_3 = \lambda z_3$$
So we have 3 equations and we have 3 unknowns. An easy option is if $\lambda = 0$. Then:
$$2z_2 = 0 \quad \text{and} \quad 5z_3 = 0 \implies z_2 = z_3 = 0$$
Hence the vector $(z_1, 0, 0)$, $z_1 \neq 0$, is associated with eigenvalue $\lambda = 0$.
If instead we consider the other obvious one, $\lambda = 5$, then:
$$0 = 5z_2 \implies z_2 = 0, \qquad 2z_2 = 5z_1 \implies z_1 = 0$$
Hence the vector $(0, 0, z_3)$, $z_3 \neq 0$, is associated with the eigenvalue $\lambda = 5$.
Notice that these are the only eigenvalues here. We could turn the system of equations into a matrix:
$$\begin{pmatrix} 0 & 2 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 5 \end{pmatrix}$$
and solve using the characteristic polynomial to show this. But alternatively, we can see that for any $\lambda \neq 0, 5$ the third equation forces $z_3 = 0$, the second forces $z_2 = 0$, and then the first forces $z_1 = 0$, so we only had the two obvious choices above.
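Checking numerically with numpy:

```python
import numpy as np

# Matrix of T(z1, z2, z3) = (2*z2, 0, 5*z3) in the standard basis.
T = np.array([[0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 5.0]])

vals, vecs = np.linalg.eig(T)
print(vals)    # [0. 0. 5.] -> eigenvalues 0 and 5 only
print(vecs)    # (1,0,0) for eigenvalue 0 (its eigenspace is 1-dimensional,
               # so the two columns coincide), (0,0,1) for eigenvalue 5
```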
11
Define $T \in \mathcal{L}(\mathcal{P}(\mathbf{R}))$ by $Tp = p'$. We'll find all eigenvalues and eigenvectors for $T$.
We need to find when:
$$p' = \lambda p$$
simplifying: writing $p(x) = a_0 + a_1x + \cdots + a_mx^m$, this is
$$a_1 + 2a_2x + \cdots + ma_mx^{m-1} = \lambda a_0 + \lambda a_1x + \cdots + \lambda a_mx^m$$
equating coefficients yields a system of equations:
$$\lambda a_k = (k + 1)a_{k+1} \quad (0 \leq k \leq m - 1), \qquad \lambda a_m = 0$$
Notice that if $\lambda = 0$ then we get that all $a_k = 0$ for $k \geq 1$, meaning $p = a_0$, which is a constant function. Hence all nonzero constant polynomials are eigenvectors with $\lambda = 0$ as an eigenvalue.
If instead we have $\lambda \neq 0$ then we have the simplified case looking at the last equation, $\lambda a_m = 0$. Hence, $a_m = 0$, so then $\lambda a_{m-1} = ma_m = 0$, so then $a_{m-1} = 0$. Repeating, we get that every $a_k = 0$, so then $p$ is the zero function, which isn't an eigenvector. Hence there are no eigenvectors for any $\lambda \neq 0$.
Overall, the only eigenvalue is $\lambda = 0$, with the nonzero constant polynomials as its eigenvectors.
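A finite-dimensional sanity check: restricting differentiation to $\mathcal{P}_4(\mathbf{R})$ (my choice of a small stand-in for $\mathcal{P}(\mathbf{R})$), its matrix on the basis $1, x, x^2, x^3, x^4$ is strictly upper triangular, so $0$ is its only eigenvalue:

```python
import numpy as np

n = 5                            # dim P_4(R): basis 1, x, x^2, x^3, x^4
D = np.zeros((n, n))
for k in range(1, n):
    D[k - 1, k] = k              # d/dx (x^k) = k * x^(k-1)

vals, _ = np.linalg.eig(D)
print(vals)                            # all zeros
print(np.linalg.matrix_rank(D))        # n - 1, so null D = constants only
```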
12
Define $T \in \mathcal{L}(\mathcal{P}_4(\mathbf{R}))$ by:
$$(Tp)(x) = xp'(x)$$
for all $x \in \mathbf{R}$. We'll find all eigenvalues and eigenvectors of $T$.
Writing $p(x) = a_0 + a_1x + \cdots + a_4x^4$, the equation $Tp = \lambda p$ simplifies to:
$$a_1x + 2a_2x^2 + 3a_3x^3 + 4a_4x^4 = \lambda a_0 + \lambda a_1x + \cdots + \lambda a_4x^4$$
We can equate coefficients to get:
$$ka_k = \lambda a_k \quad (0 \leq k \leq 4)$$
But look! Clearly if $\lambda = 0$ then that forces all $a_k = 0$ for $k \geq 1$, while $a_0$ is free, so then $p = a_0$, which is any constant function. Hence, $\lambda = 0$ corresponds to the constant function eigenvectors.
If instead $\lambda \neq 0$ then this is how we can construct our eigenvectors. Notice if all $a_k = 0$ for $k \neq j$ while $a_j \neq 0$ then we have true statements for every equation except the remaining one, $ja_j = \lambda a_j$. Since $a_j \neq 0$ then we have (by dividing) that $\lambda = j$. Looking at the resulting polynomial, that gives:
$$p(x) = a_jx^j$$
Hence, in general any $p(x) = ax^j$ (with $a \neq 0$ and $0 \leq j \leq 4$) is an eigenvector with $\lambda = j$ as its eigenvalue.
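On the monomial basis the matrix of this $T$ is diagonal, which makes the numpy check immediate:

```python
import numpy as np

# Matrix of (Tp)(x) = x * p'(x) on P_4(R) in the basis 1, x, x^2, x^3, x^4:
# T(x^k) = k * x^k, so the matrix is diag(0, 1, 2, 3, 4).
T = np.diag([0.0, 1.0, 2.0, 3.0, 4.0])

vals, vecs = np.linalg.eig(T)
print(vals)    # [0. 1. 2. 3. 4.]
print(vecs)    # the standard basis: each monomial x^j is an eigenvector
```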
14
Suppose $V = U \oplus W$, where $U$ and $W$ are nonzero subspaces of $V$. Define $P \in \mathcal{L}(V)$ by $P(u + w) = u$ for $u \in U$ and $w \in W$. Find all eigenvalues and eigenvectors of $P$.
Let $v \in V$ be arbitrary. Let $v = u + w$ with $u \in U$ and $w \in W$. Consider:
$$Pv = \lambda v \implies u = \lambda u + \lambda w$$
But since we have a direct sum, we can't have both $(1 - \lambda)u$ and $\lambda w$ nonzero, since if that was the case then the decomposition of $u$ into $U + W$ wouldn't be unique. Hence, we must have:
$$(1 - \lambda)u = 0 \quad \text{and} \quad \lambda w = 0$$
Which only happens (for $v \neq 0$) if $\lambda = 1$ or $\lambda = 0$. If $\lambda = 1$ then we get:
$$w = 0$$
and thus $v = u$, which is a (possibly) non-zero vector. Hence, any nonzero $u \in U$ is an eigenvector with eigenvalue $1$.
Similarly, if $\lambda = 0$. Then we get:
$$u = 0$$
so then $v = w$. Hence, any nonzero $w \in W$ is an eigenvector with eigenvalue $0$.
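A concrete numpy illustration (my own choice of $U$ and $W$ in $\mathbf{R}^2$, not from the exercise):

```python
import numpy as np

# U = span{(1, 0)}, W = span{(1, 1)}, so R^2 = U (+) W; P maps u + w to u.
B = np.array([[1.0, 1.0],        # columns: basis of U, basis of W
              [0.0, 1.0]])
P = B @ np.diag([1.0, 0.0]) @ np.linalg.inv(B)   # identity on U, zero on W

vals, vecs = np.linalg.eig(P)
print(vals)    # [1. 0.]
print(vecs)    # eigenvector for 1 spans U; eigenvector for 0 spans W
```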
15
Suppose $S, T \in \mathcal{L}(V)$. Suppose specifically that $S$ is invertible.
a
Theorem
$T$ and $S^{-1}TS$ have the same eigenvalues.
Proof
Let $\lambda$ be an arbitrary eigenvalue from $T$, with corresponding eigenvector $v$. Thus:
$$Tv = \lambda v$$
Because $S$ is invertible, then $S$ is both injective and surjective. Surjectivity will come into play here, since that means there's some $u$ where $Su = v$. Thus:
$$(S^{-1}TS)u = S^{-1}T(Su) = S^{-1}Tv = S^{-1}(\lambda v) = \lambda S^{-1}v = \lambda u$$
Thus $u = S^{-1}v$ is an eigenvector for the eigenvalue $\lambda$, for the transformation $S^{-1}TS$. Hence, since $\lambda$ was arbitrary, it holds for all eigenvalues that $\lambda$ is shared between $T$ and $S^{-1}TS$.
The reverse direction holds by a similar proof (apply the above with $S^{-1}TS$ in place of $T$ and $S^{-1}$ in place of $S$, since $T = S(S^{-1}TS)S^{-1}$), showing that eigenvalues of $S^{-1}TS$ are eigenvalues of $T$ similarly.
☐
b
Notice from the prior part that the eigenvectors of $T$ are not (in general) the same for $S^{-1}TS$. Namely, the new eigenvector associated with $\lambda$ is $S^{-1}v$ for each eigenvector $v$ from $T$.
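A quick numpy check of both parts (random matrices chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))          # generically invertible
Sinv = np.linalg.inv(S)
A = Sinv @ T @ S                         # S^{-1} T S

print(np.sort_complex(np.linalg.eigvals(T)))   # same spectrum...
print(np.sort_complex(np.linalg.eigvals(A)))   # ...for S^{-1} T S

# If v is an eigenvector of T for lambda, then S^{-1} v is one for S^{-1} T S:
vals, vecs = np.linalg.eig(T)
lam, v = vals[0], vecs[:, 0]
print(np.allclose(A @ (Sinv @ v), lam * (Sinv @ v)))   # True
```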
18
We'll show that the operator $T \in \mathcal{L}(\mathbf{F}^\infty)$ defined by:
$$T(z_1, z_2, \dots) = (0, z_1, z_2, \dots)$$
has no eigenvalues. Assume for contradiction that it does, at least one $\lambda$, with some unknown eigenvector $z = (z_1, z_2, \dots) \neq 0$. Then:
$$(0, z_1, z_2, \dots) = (\lambda z_1, \lambda z_2, \lambda z_3, \dots)$$
We can equate coordinates to get:
$$\lambda z_1 = 0, \quad \lambda z_2 = z_1, \quad \lambda z_3 = z_2, \quad \dots$$
But look at this. Clearly an option is $\lambda = 0$. If that's the case then $Tz = (0, z_1, z_2, \dots) = 0$, so then every $z_k = 0$, which isn't a valid eigenvector. So instead consider that $\lambda \neq 0$. Then from the first equation we have it that $z_1 = 0$, but then from the second equation then $z_2 = z_1/\lambda = 0$, and so on. This means that:
$$z_1 = z_2 = z_3 = \cdots = 0$$
Hence then $z = 0$, which again isn't a valid eigenvector. Hence, no cases allow for eigenvectors to arise, so there are no valid $\lambda$.
19
Suppose $n$ is a positive integer and $T \in \mathcal{L}(\mathbf{F}^n)$ is defined by:
$$T(x_1, \dots, x_n) = (x_1 + \cdots + x_n, \dots, x_1 + \cdots + x_n)$$
so then $T$ is the operator whose matrix (w.r.t. the standard basis) consists of all 1's. We'll find all eigenvalues and eigenvectors of $T$.
Consider when:
$$T(x_1, \dots, x_n) = \lambda(x_1, \dots, x_n)$$
Equating rows of our vector, we have:
$$x_1 + \cdots + x_n = \lambda x_1 = \lambda x_2 = \cdots = \lambda x_n$$
Namely, $\lambda x_j = x_1 + \cdots + x_n$ for all $j$. That means that:
$$\lambda x_1 = \lambda x_2 = \cdots = \lambda x_n$$
so then we have two possibilities. If $\lambda = 0$, then we just require that $x_1 + \cdots + x_n = 0$, so then any nonzero vector whose coordinates sum to $0$ is a valid eigenvector with eigenvalue $0$.
If instead we have $\lambda \neq 0$: then the chain of equations suggests that:
$$x_1 = x_2 = \cdots = x_n$$
by dividing $\lambda$ from all sides. If $x_1 = \cdots = x_n = 0$ we get the zero vector from before, so assume that $x_1 = \cdots = x_n = c \neq 0$. If that's the case, then:
$$x_1 + \cdots + x_n = nc$$
Thus:
$$\lambda c = nc \implies \lambda = n$$
So the only nonzero eigenvalue is $\lambda = n$ (and in the case of $n = 1$ we just get the identity map). Hence, any $(c, c, \dots, c)$ with $c \neq 0$ is an eigenvector of $T$ with eigenvalue $n$.
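Verifying with numpy for a small $n$ (chosen for illustration):

```python
import numpy as np

n = 4
T = np.ones((n, n))                     # the matrix of all 1's

vals, _ = np.linalg.eig(T)
print(np.round(vals, 10))               # n once, 0 with multiplicity n - 1

ones = np.ones(n)
print(np.allclose(T @ ones, n * ones))  # (1, ..., 1) has eigenvalue n
x = np.array([1.0, -1.0, 2.0, -2.0])    # coordinates sum to 0
print(np.allclose(T @ x, 0 * x))        # eigenvalue 0
```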
20
Consider $T \in \mathcal{L}(\mathbf{F}^\infty)$ defined by:
$$T(z_1, z_2, z_3, \dots) = (z_2, z_3, z_4, \dots)$$
We'll find all eigenvalues and eigenvectors of $T$:
$$(z_2, z_3, z_4, \dots) = \lambda(z_1, z_2, z_3, \dots)$$
Equating rows gives:
$$z_2 = \lambda z_1, \quad z_3 = \lambda z_2, \quad z_4 = \lambda z_3, \quad \dots$$
If $\lambda = 0$ then we get that $z_2 = z_3 = \cdots = 0$, so then we have $(z_1, 0, 0, \dots)$, $z_1 \neq 0$, as an eigenvector with eigenvalue $0$.
If instead $\lambda \neq 0$ then notice that:
$$z_2 = \lambda z_1, \qquad z_3 = \lambda z_2 = \lambda^2 z_1$$
Notice in general that we can show that $z_{k+1} = \lambda^k z_1$. We showed our base case, so let our inductive hypothesis reign for $k$, and we'll show the $(k+1)$-th case:
$$z_{k+2} = \lambda z_{k+1} = \lambda(\lambda^k z_1) = \lambda^{k+1}z_1$$
showing the $(k+1)$-th case. Notice if $z_1 = 0$ then all $z_k = 0$, giving the zero vector, which is invalid. So suppose $z_1 \neq 0$. Then for any $\lambda \in \mathbf{F}$ we can create an eigenvector defined as:
$$z = (z_1, \lambda z_1, \lambda^2 z_1, \lambda^3 z_1, \dots)$$
then this eigenvector has an associated eigenvalue of $\lambda$. We can verify this:
$$Tz = (\lambda z_1, \lambda^2 z_1, \lambda^3 z_1, \dots) = \lambda(z_1, \lambda z_1, \lambda^2 z_1, \dots) = \lambda z$$
Which works. As an example, if $\lambda = 2$ and $z_1 = 1$ then our eigenvector is:
$$(1, 2, 4, 8, 16, \dots)$$
as expected.
21
Suppose $T \in \mathcal{L}(V)$ is invertible.
a
Theorem
Suppose $\lambda \in \mathbf{F}$ with $\lambda \neq 0$. Then $\lambda$ is an eigenvalue of $T$ iff $\frac{1}{\lambda}$ is an eigenvalue of $T^{-1}$.
Proof
Consider $(\Rightarrow)$, so suppose $\lambda$ is an eigenvalue of $T$. Then there's some associated eigenvector $v \neq 0$ with $Tv = \lambda v$. Then notice that:
$$v = T^{-1}(Tv) = T^{-1}(\lambda v) = \lambda T^{-1}v \implies T^{-1}v = \frac{1}{\lambda}v$$
thus $\frac{1}{\lambda}$ is an eigenvalue for $T^{-1}$.
Going in the reverse direction, the same argument applied to $T^{-1}$ (whose inverse is $T$) and the eigenvalue $\frac{1}{\lambda}$ shows the $(\Leftarrow)$ part.
☐
b
Theorem
$T$ and $T^{-1}$ have the same eigenvectors.
Proof
Let $\lambda$ be an eigenvalue for $T$ with eigenvector $v$ (note $\lambda \neq 0$ since $T$ is invertible). As shown above, we have the eigenvalue $\frac{1}{\lambda}$ for $T^{-1}$. We need to show that the eigenvector associated with that value is still $v$.
Notice that:
$$T^{-1}v = T^{-1}\left(\frac{1}{\lambda}(\lambda v)\right) = \frac{1}{\lambda}T^{-1}(Tv) = \frac{1}{\lambda}v$$
So then $T^{-1}$'s eigenvector for $\frac{1}{\lambda}$ is still $v$, as we needed. Since $\lambda$ was an arbitrary eigenvalue, then for all eigenvalues of $T$ we have $T$ and $T^{-1}$ sharing eigenvectors. This works in both directions, since $(T^{-1})^{-1} = T$.
☐
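A quick numpy check (random invertible $T$ chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 3)) + 3 * np.eye(3)   # generically invertible
Tinv = np.linalg.inv(T)

# Each eigenvector v of T is an eigenvector of T^{-1} with eigenvalue 1/lambda:
vals, vecs = np.linalg.eig(T)
for lam, v in zip(vals, vecs.T):
    print(np.allclose(Tinv @ v, (1 / lam) * v))   # True for each pair
```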
23
Theorem
Suppose $V$ is finite-dimensional and $S, T \in \mathcal{L}(V)$. Then $ST$ and $TS$ have the same eigenvalues.
Proof
Let $\lambda$ be an eigenvalue of $ST$ with corresponding eigenvector $v \neq 0$. Notice that:
$$(TS)(Tv) = T(STv) = T(\lambda v) = \lambda(Tv)$$
Thus, if $Tv \neq 0$, then $\lambda$ is an eigenvalue for $TS$ with the corresponding eigenvector $Tv$. If instead $Tv = 0$, then $\lambda v = S(Tv) = 0$ forces $\lambda = 0$; in that case $T$ isn't injective, hence not invertible, so $TS$ isn't invertible either (since $V$ is finite-dimensional), and therefore $\lambda = 0$ is an eigenvalue of $TS$ as well. Since $\lambda$ was arbitrary, this holds true for all eigenvalues from $ST$ to $TS$. A similar proof by swapping $S$ with $T$ shows the reverse direction.
☐
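Numerically (random matrices chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
S = rng.standard_normal((4, 4))
T = rng.standard_normal((4, 4))

# ST and TS share the same spectrum:
print(np.sort_complex(np.linalg.eigvals(S @ T)))
print(np.sort_complex(np.linalg.eigvals(T @ S)))
```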
29
Theorem
Suppose $T \in \mathcal{L}(V)$ and $\dim \operatorname{range} T = k$. Then $T$ has at most $k + 1$ distinct eigenvalues.
Proof
Assume for contradiction that $T$ has $k + 2$ or more distinct eigenvalues. Hence, there are at least $\lambda_1, \dots, \lambda_{k+2}$, all distinct eigenvalues of $T$, with corresponding eigenvectors $v_1, \dots, v_{k+2}$.
Notice that at most one of the $\lambda_j$ can be $0$, so reorder so that $\lambda_1, \dots, \lambda_{k+1}$ are all nonzero. For each of these we have $v_j = T\left(\frac{1}{\lambda_j}v_j\right)$, so $v_1, \dots, v_{k+1} \in \operatorname{range} T$.
But look! Eigenvectors corresponding to distinct eigenvalues are linearly independent, so $v_1, \dots, v_{k+1}$ are $k + 1$ linearly independent vectors sitting inside $\operatorname{range} T$, which has dimension only $k$. This is a contradiction, so we must have the opposite of the assumption. Thus $T$ has at most $k + 1$ distinct eigenvalues.
☐
Chapter 5.B: Eigenvectors and UT Matrices
1
Suppose $T \in \mathcal{L}(V)$ and there is some positive integer $n$ such that $T^n = 0$.
a
Theorem
$I - T$ is invertible and $(I - T)^{-1} = I + T + \cdots + T^{n-1}$.
Proof
We can just show that:
$$(I - T)(I + T + \cdots + T^{n-1}) = (I + T + \cdots + T^{n-1}) - (T + T^2 + \cdots + T^n) = I - T^n = I$$
and the same computation works with the factors in the other order. Thus we've happened to find a transformation as defined above and showed that it works as a two-sided inverse. Thus, it's the inverse, showing $I - T$ is invertible.
☐
b
Usually we pull these ideas from properties of the real numbers, or other discrete math. Namely, we know the geometric series:
$$\frac{1}{1 - x} = 1 + x + x^2 + x^3 + \cdots \quad (|x| < 1)$$
where since we can't take an infinite sum of operators, we need the series to terminate:
$$(I - T)^{-1} = I + T + T^2 + \cdots + T^{n-1}$$
where all later terms vanish; hence the condition of $T^n = 0$, as if that's true then $T^k = 0$ for all $k \geq n$.
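A small numpy check with an illustrative nilpotent matrix:

```python
import numpy as np

# A strictly upper-triangular N is nilpotent: here N^3 = 0.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])
I = np.eye(3)

geometric_sum = I + N + N @ N                              # I + N + ... + N^(n-1)
print(np.allclose(np.linalg.inv(I - N), geometric_sum))    # True
```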
2
Theorem
Suppose $T \in \mathcal{L}(V)$ and $(T - 2I)(T - 3I)(T - 4I) = 0$. Suppose $\lambda$ is an eigenvalue of $T$. Then $\lambda = 2$, $\lambda = 3$, or $\lambda = 4$.
Proof
Let $\lambda$ be an eigenvalue of $T$. Then we know that:
$$Tv = \lambda v$$
where $v \neq 0$ is $\lambda$'s associated eigenvector. Now, assume for contradiction that $\lambda \neq 2, 3, 4$ (all). Notice that:
$$(T - 4I)v = (\lambda - 4)v$$
Notice that if this were $0$ that would be a contradiction, as then $\lambda = 4$ would be a valid eigenvalue. Hence, say $(T - 4I)v = w_1$ where $w_1 = (\lambda - 4)v$ is non-zero. Then:
$$(T - 3I)w_1 = (\lambda - 3)w_1$$
In a similar way, if $(\lambda - 3)w_1 = 0$ then that would imply that $\lambda = 3$ is an eigenvalue, which is a contradiction. Hence, say $(T - 3I)w_1 = w_2$ for nonzero $w_2 = (\lambda - 3)w_1$. Then:
$$0 = (T - 2I)(T - 3I)(T - 4I)v = (T - 2I)w_2 = (\lambda - 2)w_2$$
so then $\lambda = 2$ is forced, which is a contradiction. Hence, $\lambda \in \{2, 3, 4\}$.
☐
4
Theorem
Suppose $P \in \mathcal{L}(V)$ and $P^2 = P$. Then $V = \operatorname{null} P \oplus \operatorname{range} P$.
Proof
We can show this by showing $\operatorname{null} P \cap \operatorname{range} P$ only contains the zero vector. Let $v$ be any non-zero vector. Assume for contradiction that $v \in \operatorname{null} P$ and $v \in \operatorname{range} P$. Then $Pv = 0$, and further there is some vector $u$ where $Pu = v$. Notice though that:
$$v = Pu = P^2u = P(Pu) = Pv$$
while:
$$Pv = 0$$
so then:
$$v = 0$$
Which contradicts $v$ being non-zero. Hence $\operatorname{null} P$ is disjoint with $\operatorname{range} P$ (apart from $0$), so then we get a direct sum. This is also a valid sum for all of $V$ since, by the FTOLM (fundamental theorem of linear maps):
$$\dim V = \dim \operatorname{null} P + \dim \operatorname{range} P$$
as $\operatorname{null} P$ and $\operatorname{range} P$ are disjoint. (For infinite-dimensional $V$, one can instead write any $v$ as $(v - Pv) + Pv$, noting $P(v - Pv) = Pv - P^2v = 0$ and $Pv \in \operatorname{range} P$.)
☐
5
Theorem
Suppose $T \in \mathcal{L}(V)$ and $S \in \mathcal{L}(V)$ is invertible. Suppose $p \in \mathcal{P}(\mathbf{F})$ is a polynomial. Then:
$$p(STS^{-1}) = S\,p(T)\,S^{-1}$$
Proof
First, we'll prove a small lemma, namely that $(STS^{-1})^n = ST^nS^{-1}$ for all positive integers $n$.
First, notice that the case $n = 1$ is trivially satisfied. Suppose the lemma works for all cases under $n$. Consider the $n$-th case:
$$(STS^{-1})^n = (STS^{-1})^{n-1}(STS^{-1}) = ST^{n-1}S^{-1}STS^{-1} = ST^{n-1}TS^{-1} = ST^nS^{-1}$$
hence completing the proof for the lemma.
We know that, supposing $p(z) = a_0 + a_1z + \cdots + a_mz^m$:
$$p(STS^{-1}) = \sum_{k=0}^{m} a_k(STS^{-1})^k = \sum_{k=0}^{m} a_kST^kS^{-1} = S\left(\sum_{k=0}^{m} a_kT^k\right)S^{-1} = S\,p(T)\,S^{-1}$$
Completing the proof. (Note the $k = 0$ term also fits the pattern, since $I = SIS^{-1}$.)
☐
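A numerical spot-check with an illustrative polynomial $p(z) = 2 + 3z + z^2$:

```python
import numpy as np

rng = np.random.default_rng(4)
T = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))          # generically invertible
Sinv = np.linalg.inv(S)

def p(M):
    """p(z) = 2 + 3z + z^2 applied to a square matrix."""
    I = np.eye(M.shape[0])
    return 2 * I + 3 * M + M @ M

print(np.allclose(p(S @ T @ Sinv), S @ p(T) @ Sinv))   # True
```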
7
Theorem
Suppose $T \in \mathcal{L}(V)$. Then $9$ is an eigenvalue of $T^2$ iff $3$ or $-3$ is an eigenvalue of $T$.
Proof
Consider $(\Rightarrow)$, so suppose $9$ is an eigenvalue of $T^2$, so then:
$$T^2v = 9v$$
for some $v \neq 0$. Notice that:
$$(T^2 - 9I)v = 0$$
where, since powers of $T$ commute, $T^2 - 9I = (T - 3I)(T + 3I)$. Notice that $v \neq 0$, so then:
$$(T - 3I)(T + 3I)v = 0$$
now assume for contradiction that $3$ and $-3$ are not eigenvalues of $T$. Then notice that we must have $(T + 3I)v \neq 0$, since if it were $0$ then $-3$ would be a valid eigenvalue of $T$. Hence, call this vector $w = (T + 3I)v$. Then:
$$(T - 3I)w = 0$$
So then $3$ is an eigenvalue (with eigenvector $w \neq 0$), which is also a contradiction. Hence, no matter what we get a contradiction, so then either $3$ or $-3$ is an eigenvalue of $T$.
Now consider $(\Leftarrow)$. Suppose $3$ or $-3$ is an eigenvalue of $T$. WLOG, suppose $3$ is an eigenvalue. Then:
$$Tv = 3v$$
for some associated eigenvector $v \neq 0$. Then:
$$T^2v = T(Tv) = T(3v) = 3Tv = 9v$$
So then notice that $T^2v = 9v$, so then $9$ is an eigenvalue for $T^2$.
In a similar way if $-3$ was the eigenvalue, then:
$$T^2v = T(-3v) = -3Tv = (-3)^2v = 9v$$
so then $9$ is still an eigenvalue for $T^2$.
☐
8
We want to find an example of $T \in \mathcal{L}(\mathbf{R}^2)$ where $T^4 = -I$. This suggests that:
$$T^4 + I = 0$$
Taking some inspiration from complex numbers, it's like we want to find the solutions to:
$$x^4 = -1$$
But notice that these won't work as standard eigenvalues, as the solutions are non-real while an operator on $\mathbf{R}^2$ needs real eigenvalues. But instead, this motivates us to use rotation as a way to model this transformation.
Let $T$ be the transformation that rotates a vector by $\frac{\pi}{4}$ radians. Note that we can find the matrix under the standard basis as:
$$\mathcal{M}(T) = \begin{pmatrix} \cos\frac{\pi}{4} & -\sin\frac{\pi}{4} \\ \sin\frac{\pi}{4} & \cos\frac{\pi}{4} \end{pmatrix} = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$$
We can verify that $T^4v = -v$ for any vector $v$. First notice that squaring gives:
$$\mathcal{M}(T)^2 = \frac{1}{2}\begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$
which is rotation by $\frac{\pi}{2}$, and then, reapplying the same idea:
$$\mathcal{M}(T)^4 = \left(\mathcal{M}(T)^2\right)^2 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}$$
Thus:
$$T^4 = -I$$
as requested.
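Verifying the rotation numerically:

```python
import numpy as np

theta = np.pi / 4
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])    # rotation by pi/4

T4 = np.linalg.matrix_power(T, 4)
print(np.allclose(T4, -np.eye(2)))                 # True: T^4 = -I
```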
10
Theorem
Suppose $T \in \mathcal{L}(V)$ and $v$ is an eigenvector of $T$ (with eigenvalue $\lambda$). Suppose $p \in \mathcal{P}(\mathbf{F})$. Then $p(T)v = p(\lambda)v$.
Proof
Since $v$ is an eigenvector of $T$ then:
$$Tv = \lambda v$$
We will prove the theorem by induction on $m = \deg p$. Say $m = 0$, so then:
$$p(z) = a_0$$
Notice then that:
$$p(T)v = a_0Iv = a_0v = p(\lambda)v$$
showing the theorem holds in the base case.
Thus, the base case is proved, so suppose that for all polynomials of degree less than $m$ the theorem holds, where $m \geq 1$. Consider this new case: write
$$p(z) = zq(z) + a_0$$
where $\deg q = m - 1$ and $a_0 = p(0)$, and thus:
$$p(T)v = Tq(T)v + a_0v$$
Thus we use the inductive hypothesis below:
$$Tq(T)v + a_0v = T(q(\lambda)v) + a_0v = q(\lambda)(Tv) + a_0v = (\lambda q(\lambda) + a_0)v = p(\lambda)v$$
Thus showing the inductive step. As such, by induction, the theorem holds for all $p$.
☐
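A final numpy spot-check, with $p(z) = 1 + 2z + z^3$ and a random $T$ chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
T = rng.standard_normal((3, 3))
vals, vecs = np.linalg.eig(T)
lam, v = vals[0], vecs[:, 0]

# p(T) v should equal p(lambda) v for p(z) = 1 + 2z + z^3.
pT = np.eye(3) + 2 * T + np.linalg.matrix_power(T, 3)
print(np.allclose(pT @ v, (1 + 2 * lam + lam**3) * v))   # True
```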