Give an example of $T \in \mathcal{L}(\mathbf{C}^2)$ such that $0$ is the only eigenvalue of $T$ and the singular values of $T$ are $5, 0$.
Proof
Consider:
$$T = \begin{pmatrix} 0 & 5 \\ 0 & 0 \end{pmatrix}, \quad \text{that is,} \quad T(z_1, z_2) = (5z_2, 0).$$
Then:
$$T(z_1, z_2) = \lambda(z_1, z_2) \implies 5z_2 = \lambda z_1 \ \text{and}\ 0 = \lambda z_2.$$
Notice that $0$ is the only eigenvalue here (if $\lambda \neq 0$, then $z_2 = 0$, which forces $z_1 = 0$), with eigenvector $(1, 0)$ in this case. However, calculating $T^*$:
$$T^* = \begin{pmatrix} 0 & 0 \\ 5 & 0 \end{pmatrix}.$$
Thus we can compute $T^*T$. Namely:
$$T^*T = \begin{pmatrix} 0 & 0 \\ 5 & 0 \end{pmatrix}\begin{pmatrix} 0 & 5 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 25 \end{pmatrix}.$$
This is confirmed by calculating $T^*T(z_1, z_2)$ manually:
$$T^*T(z_1, z_2) = T^*(5z_2, 0) = (0, 25z_2).$$
Thus:
$$T^*T = \begin{pmatrix} 0 & 0 \\ 0 & 25 \end{pmatrix}.$$
Thus the eigenvalues of $T^*T$ are $0$ and $25$, so its singular values are the square roots; namely, $0$ and $5$ are the singular values of $T$.
☐
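As a quick numerical sanity check (a sketch using NumPy; the matrix below is a standard example with these properties, matching the one used above):

```python
import numpy as np

# An operator on C^2 whose only eigenvalue is 0
# but whose singular values are 5 and 0.
T = np.array([[0.0, 5.0],
              [0.0, 0.0]])

eigenvalues = np.linalg.eigvals(T)
singular_values = np.linalg.svd(T, compute_uv=False)  # sorted descending

print(eigenvalues)       # both eigenvalues are 0
print(singular_values)   # [5. 0.]
```

This makes the mismatch concrete: the eigenvalues of a non-normal operator tell you nothing about its singular values.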
4
Question
Suppose $T \in \mathcal{L}(V)$ and $s$ is a singular value of $T$. Prove that there exists a vector $v \in V$ such that $\|v\| = 1$ and $\|Tv\| = s$.
Proof
Since $s$ is a singular value of $T$, $s$ is an eigenvalue of $\sqrt{T^*T}$ by definition, and thus there exists a corresponding eigenvector for $s$, which we'll name $u$. Thus $\sqrt{T^*T}\,u = su$. But notice:
$$\|Tu\|^2 = \langle Tu, Tu\rangle = \langle T^*Tu, u\rangle = \left\langle \sqrt{T^*T}\sqrt{T^*T}\,u, u\right\rangle = \langle s^2 u, u\rangle = s^2\|u\|^2.$$
Thus, since $s \geq 0$:
$$\|Tu\| = s\|u\|.$$
Now choose $v = \frac{u}{\|u\|}$, where $\|u\| \neq 0$ since $u$ is an eigenvector. Notice that $v$ is still an eigenvector of $\sqrt{T^*T}$ with eigenvalue $s$, so the finding above works if we replace all the $u$'s with $v$'s. But notice that now:
$$\|v\| = \left\|\frac{u}{\|u\|}\right\| = \frac{\|u\|}{\|u\|} = 1.$$
Our chosen vector satisfies $\|v\| = 1$ as required. Because of this, plugging $v$ into the equation up top gives $\|Tv\| = s\|v\| = s$, as required.
☐
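Numerically, the unit vectors produced by this argument are exactly the right singular vectors. A small NumPy illustration (my own addition, for an arbitrary operator on $\mathbf{R}^4$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # an arbitrary operator on R^4

# Rows of Vt are unit eigenvectors of sqrt(A^T A); each row v
# is a unit vector with ||Av|| equal to the matching singular value.
_, s, Vt = np.linalg.svd(A)
for sigma, v in zip(s, Vt):
    assert np.isclose(np.linalg.norm(v), 1.0)
    assert np.isclose(np.linalg.norm(A @ v), sigma)
print("each singular value s has a unit vector v with ||Av|| = s")
```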
5
Question
Suppose $T \in \mathcal{L}(\mathbf{F}^2)$ is defined by $T(x, y) = (-4y, x)$. Find the singular values of $T$.
Proof
We should first find $T^*$. Notice for a fixed $(x, y)$ and arbitrary $(u, w)$:
$$\langle T(x, y), (u, w)\rangle = \langle (-4y, x), (u, w)\rangle = -4y\overline{u} + x\overline{w} = \langle (x, y), (w, -4u)\rangle.$$
Thus $T^*(u, w) = (w, -4u)$. We should now determine the eigenvalues of $T^*T$ and then find the square roots of these eigenvalues. First notice that $T^*T(x, y) = T^*(-4y, x) = (x, 16y)$, so:
$$\mathcal{M}(T^*T) = \begin{pmatrix} 1 & 0 \\ 0 & 16 \end{pmatrix}.$$
As such, we have a diagonal matrix, so the eigenvalues are on the diagonal. Thus $\lambda = 1, 16$, with the corresponding eigenvectors being the standard basis vectors for $\mathbf{F}^2$.
Therefore, our singular values are the square roots of these: $4$ and $1$.
☐
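A quick NumPy check (assuming the map is $T(x, y) = (-4y, x)$ as in the question, written as a matrix in the standard basis):

```python
import numpy as np

# Matrix of T(x, y) = (-4y, x) with respect to the standard basis.
T = np.array([[0.0, -4.0],
              [1.0,  0.0]])

svals = np.linalg.svd(T, compute_uv=False)
print(svals)  # [4. 1.]
```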
6
Question
Find the singular values of the differentiation operator $D \in \mathcal{L}(\mathcal{P}_2(\mathbf{R}))$ defined by $Dp = p'$, with the inner product given by:
$$\langle p, q\rangle = \int_{-1}^{1} p(x)\,q(x)\,dx.$$
Proof
Example 6.33 gives the standard orthonormal basis for $\mathcal{P}_2(\mathbf{R})$:
$$e_1 = \sqrt{\tfrac{1}{2}}, \quad e_2 = \sqrt{\tfrac{3}{2}}\,x, \quad e_3 = \sqrt{\tfrac{45}{8}}\left(x^2 - \tfrac{1}{3}\right).$$
Notice what the differentiation operator does to our basis vectors:
$$De_1 = 0, \quad De_2 = \sqrt{\tfrac{3}{2}} = \sqrt{3}\,e_1, \quad De_3 = 2\sqrt{\tfrac{45}{8}}\,x = \sqrt{15}\,e_2.$$
Thus:
$$\mathcal{M}(D) = \begin{pmatrix} 0 & \sqrt{3} & 0 \\ 0 & 0 & \sqrt{15} \\ 0 & 0 & 0 \end{pmatrix}.$$
So:
$$\mathcal{M}(D^*D) = \mathcal{M}(D)^*\,\mathcal{M}(D) = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 15 \end{pmatrix}.$$
Thus our singular values, from largest to smallest, are:
$$\sqrt{15}, \quad \sqrt{3}, \quad 0.$$
☐
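As a numerical cross-check (a sketch in NumPy; it assumes the Example 6.33 basis $\sqrt{1/2}$, $\sqrt{3/2}\,x$, $\sqrt{45/8}(x^2 - 1/3)$ and the inner product $\int_{-1}^{1} pq$), we can rebuild the matrix of $D$ by integrating directly and confirm its singular values:

```python
import numpy as np

# Orthonormal basis of P_2(R) under <p, q> = integral of p*q over [-1, 1].
e = [np.polynomial.Polynomial([np.sqrt(1/2)]),
     np.polynomial.Polynomial([0, np.sqrt(3/2)]),
     np.polynomial.Polynomial([-np.sqrt(45/8)/3, 0, np.sqrt(45/8)])]

def inner(p, q):
    # <p, q> = integral of p*q from -1 to 1
    antideriv = (p * q).integ()
    return antideriv(1.0) - antideriv(-1.0)

# Matrix of differentiation in this basis: entry [i][j] = <D e_j, e_i>.
M = np.array([[inner(e[j].deriv(), e[i]) for j in range(3)]
              for i in range(3)])

print(np.linalg.svd(M, compute_uv=False))  # ~ [sqrt(15), sqrt(3), 0]
```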
7
Question
Define $T \in \mathcal{L}(\mathbf{F}^3)$ by:
$$T(z_1, z_2, z_3) = (z_3, 2z_1, 3z_2).$$
Find (explicitly) an isometry $S \in \mathcal{L}(\mathbf{F}^3)$ such that $T = S\sqrt{T^*T}$.
Proof
For this case we can just deal with everything in terms of matrices, since we're finite-dimensional! Notice, with respect to the standard basis:
$$\mathcal{M}(T) = \begin{pmatrix} 0 & 0 & 1 \\ 2 & 0 & 0 \\ 0 & 3 & 0 \end{pmatrix}.$$
Finding $\mathcal{M}(T^*)$:
$$\mathcal{M}(T^*) = \mathcal{M}(T)^* = \begin{pmatrix} 0 & 2 & 0 \\ 0 & 0 & 3 \\ 1 & 0 & 0 \end{pmatrix}.$$
Thus $\mathcal{M}(T^*T)$:
$$\mathcal{M}(T^*T) = \begin{pmatrix} 0 & 2 & 0 \\ 0 & 0 & 3 \\ 1 & 0 & 0 \end{pmatrix}\begin{pmatrix} 0 & 0 & 1 \\ 2 & 0 & 0 \\ 0 & 3 & 0 \end{pmatrix} = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 9 & 0 \\ 0 & 0 & 1 \end{pmatrix},$$
which makes sense! As such:
$$T^*T(z_1, z_2, z_3) = (4z_1, 9z_2, z_3).$$
Since this is a diagonal matrix, clearly then:
$$\mathcal{M}\!\left(\sqrt{T^*T}\right) = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 1 \end{pmatrix}.$$
Notice that $\sqrt{T^*T}$ can be inverted:
$$\mathcal{M}\!\left(\sqrt{T^*T}\right)^{-1} = \begin{pmatrix} 1/2 & 0 & 0 \\ 0 & 1/3 & 0 \\ 0 & 0 & 1 \end{pmatrix}.$$
Multiplying the two (both ways) gives the identity, verifying this. As such, we can assume, and later verify, that:
$$\mathcal{M}(S) = \mathcal{M}(T)\,\mathcal{M}\!\left(\sqrt{T^*T}\right)^{-1} = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}.$$
Thus I claim that the transformation:
$$S(z_1, z_2, z_3) = (z_3, z_1, z_2)$$
is a valid isometry where $T = S\sqrt{T^*T}$. To validate this, first let $v = (z_1, z_2, z_3)$ be arbitrary. Then:
$$\|Sv\|^2 = |z_3|^2 + |z_1|^2 + |z_2|^2 = \|v\|^2,$$
using the Pythagorean Theorem (our standard basis vectors are orthonormal). Thus, square rooting both sides gives $\|Sv\| = \|v\|$, so $S$ is a valid isometry.
To show that $T = S\sqrt{T^*T}$, let $v = (z_1, z_2, z_3)$ be arbitrary. We know that:
$$S\sqrt{T^*T}\,(z_1, z_2, z_3) = S(2z_1, 3z_2, z_3) = (z_3, 2z_1, 3z_2) = T(z_1, z_2, z_3).$$
So since $v$ was arbitrary, it follows $T = S\sqrt{T^*T}$.
☐
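The whole factorization can be checked numerically (a NumPy sketch, assuming $T(z_1, z_2, z_3) = (z_3, 2z_1, 3z_2)$ as in the question):

```python
import numpy as np

# T(z1, z2, z3) = (z3, 2*z1, 3*z2) in the standard basis.
T = np.array([[0.0, 0.0, 1.0],
              [2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0]])

sqrtTstarT = np.diag([2.0, 3.0, 1.0])   # since T*T = diag(4, 9, 1)
S = T @ np.linalg.inv(sqrtTstarT)       # the permutation (z1,z2,z3) -> (z3,z1,z2)

assert np.allclose(S.T @ S, np.eye(3))  # S is an isometry
assert np.allclose(S @ sqrtTstarT, T)   # T = S * sqrt(T*T)
print(S)
```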
8
Question
Suppose $T \in \mathcal{L}(V)$, while $S \in \mathcal{L}(V)$ is an isometry and $R \in \mathcal{L}(V)$ is a positive operator such that $T = SR$. Prove that $R = \sqrt{T^*T}$.
Proof
Using the polar decomposition, we already know that there is some isometry $S'$ such that:
$$T = S'\sqrt{T^*T}.$$
Furthermore, we know from the question that we are given an isometry $S$ and a positive operator $R$ where:
$$T = SR.$$
Thus:
$$S'\sqrt{T^*T} = SR.$$
Notice that since both $S$ and $S'$ are isometries, by their properties they are invertible, with $S^{-1} = S^*$ and likewise $(S')^{-1} = (S')^*$. So then $S^*S = I$ and $(S')^*S' = I$. Consider:
$$T^*T = (SR)^*(SR) = R^*S^*SR = R^*R = R^2.$$
Thus we have $R^2 = T^*T$. Since $R$ is a positive operator, $R$ is self-adjoint, so $R^* = R$, which is what justified $R^*R = R^2$ above. Then $T^*T$ has a unique positive square root, which must therefore be $R$; that is, $R = \sqrt{T^*T}$.
☐
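This uniqueness can be illustrated numerically (a sketch, my own addition): build a random isometry $S$ and positive operator $R$, form $T = SR$, and recover $R$ as the positive square root of $T^*T$.

```python
import numpy as np

rng = np.random.default_rng(1)

# A random isometry S (orthogonal, via QR) and a random positive operator R.
S, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B = rng.standard_normal((3, 3))
R = B.T @ B + np.eye(3)   # positive definite, hence positive

T = S @ R

# sqrt(T*T) via the spectral decomposition of T^T T = R^2.
w, U = np.linalg.eigh(T.T @ T)
sqrtTstarT = U @ np.diag(np.sqrt(w)) @ U.T

assert np.allclose(sqrtTstarT, R)   # the positive factor is recovered exactly
```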
10
Question
Suppose $T \in \mathcal{L}(V)$ is self-adjoint. Prove that the singular values of $T$ equal the absolute values of the eigenvalues of $T$, repeated appropriately.
Proof
Since $T$ is self-adjoint, $T = T^*$, so $T$ only has real-valued eigenvalues. For our case, say that $\lambda_1, \dots, \lambda_n$ are the eigenvalues of $T$, with some repeats possible.
Since $T$ is self-adjoint, no matter what $\mathbf{F}$ is, by either Spectral Theorem we have an orthonormal basis consisting of eigenvectors of $T$. We'll label them $e_1, \dots, e_n$ in this case, and each gets paired with its corresponding eigenvalue $\lambda_i$. In essence:
$$Te_i = \lambda_i e_i.$$
Since $T$ is self-adjoint, then $T^*T = T^2$.
Consider an eigenvector of $T$, which we'll denote $e_i$; we'll show it is also an eigenvector of $T^*T$. Notice that it has eigenvalue $\lambda_i$, and that:
$$T^*Te_i = T^2 e_i = T(\lambda_i e_i) = \lambda_i^2 e_i.$$
As such, $e_i$ is an eigenvector of $T^*T$ with eigenvalue $\lambda_i^2$. We know that we have an eigenbasis for $T^*T$, namely $e_1, \dots, e_n$.
Now define $R \in \mathcal{L}(V)$ by $Re_i = |\lambda_i| e_i$. Then $R$ is a positive operator and $R^2 e_i = \lambda_i^2 e_i = T^*Te_i$, so $R^2 = T^*T$; that is, $R$ is a possible square root of $T^*T$. An arbitrary square root need not be unique, but the positive square root is, so $R = \sqrt{T^*T}$ and:
$$\sqrt{T^*T}\,e_i = |\lambda_i| e_i.$$
To end, the singular values of $T$ are the eigenvalues of $\sqrt{T^*T}$, repeated appropriately, which by the above are exactly $|\lambda_1|, \dots, |\lambda_n|$. Notice that none of these can be negative, consistent with singular values being the nonnegative square roots of the eigenvalues of $T^*T$.
☐
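A quick numerical illustration of this fact (my own addition; any real symmetric matrix is self-adjoint):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
T = (A + A.T) / 2    # a random self-adjoint (symmetric) operator

eigs = np.linalg.eigvalsh(T)
svals = np.linalg.svd(T, compute_uv=False)

# Sorted descending, the singular values are the absolute eigenvalues.
assert np.allclose(np.sort(np.abs(eigs))[::-1], svals)
print("singular values equal |eigenvalues| for self-adjoint T")
```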
14
Question
Suppose $T \in \mathcal{L}(V)$. Then $\dim \operatorname{range} T$ is equal to the number of nonzero singular values of $T$.
First we show $\operatorname{null} T = \operatorname{null} \sqrt{T^*T}$, in two directions.
$\subseteq$: Suppose $v \in \operatorname{null} T$. Then:
$$0 = \|Tv\|^2 = \langle T^*Tv, v\rangle = \left\langle \sqrt{T^*T}\sqrt{T^*T}\,v, v\right\rangle = \left\|\sqrt{T^*T}\,v\right\|^2,$$
so $\sqrt{T^*T}\,v = 0$, giving $v \in \operatorname{null}\sqrt{T^*T}$.
$\supseteq$: Reading the same chain of equalities in reverse, if $\sqrt{T^*T}\,v = 0$ then $\|Tv\| = 0$, so $v \in \operatorname{null} T$.
Thus we know that $\operatorname{null} T = \operatorname{null}\sqrt{T^*T}$, so by the Fundamental Theorem of Linear Maps, $\dim\operatorname{range} T = \dim\operatorname{range}\sqrt{T^*T}$. Since $\sqrt{T^*T}$ is diagonalizable, as it is self-adjoint, it has an orthonormal eigenbasis whose eigenvalues are the singular values of $T$, and its range is spanned by the eigenvectors with nonzero eigenvalue. Hence the number of nonzero singular values of $T$ is the dimension of $\operatorname{range}\sqrt{T^*T} = \operatorname{range} T$.
☐
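Numerically, this is the familiar fact that matrix rank equals the number of (numerically) nonzero singular values (a NumPy sketch with a deliberately rank-deficient operator):

```python
import numpy as np

rng = np.random.default_rng(3)

# A rank-2 operator on R^4, built as a product of 4x2 and 2x4 matrices.
T = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 4))

svals = np.linalg.svd(T, compute_uv=False)
num_nonzero = int(np.sum(svals > 1e-10))

assert num_nonzero == np.linalg.matrix_rank(T) == 2
print(svals)  # two nonzero singular values, two (numerically) zero
```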
17
Question
Suppose $T \in \mathcal{L}(V)$ has singular value decomposition given by:
$$Tv = s_1\langle v, e_1\rangle f_1 + \cdots + s_n\langle v, e_n\rangle f_n$$
for every $v \in V$, where $s_1, \dots, s_n$ are the singular values of $T$ and $e_1, \dots, e_n$ and $f_1, \dots, f_n$ are orthonormal bases of $V$.
(1) Prove that if $v \in V$, then:
$$T^*v = s_1\langle v, f_1\rangle e_1 + \cdots + s_n\langle v, f_n\rangle e_n.$$
(2) Prove that if $v \in V$, then:
$$T^*Tv = s_1^2\langle v, e_1\rangle e_1 + \cdots + s_n^2\langle v, e_n\rangle e_n.$$
(3) Prove that if $v \in V$, then:
$$\sqrt{T^*T}\,v = s_1\langle v, e_1\rangle e_1 + \cdots + s_n\langle v, e_n\rangle e_n.$$
(4) Suppose $T$ is invertible. Prove that if $v \in V$, then:
$$T^{-1}v = \frac{\langle v, f_1\rangle e_1}{s_1} + \cdots + \frac{\langle v, f_n\rangle e_n}{s_n}$$
for every $v \in V$.
Proof
(1) Notice that:
$$Te_j = s_j f_j$$
for all $j$. Notice then that if we use the bases $e_1, \dots, e_n$ and $f_1, \dots, f_n$, we have:
$$\mathcal{M}\bigl(T, (e_1, \dots, e_n), (f_1, \dots, f_n)\bigr) = \begin{pmatrix} s_1 & & \\ & \ddots & \\ & & s_n \end{pmatrix}.$$
Thus:
$$\mathcal{M}\bigl(T^*, (f_1, \dots, f_n), (e_1, \dots, e_n)\bigr) = \begin{pmatrix} \overline{s_1} & & \\ & \ddots & \\ & & \overline{s_n} \end{pmatrix} = \begin{pmatrix} s_1 & & \\ & \ddots & \\ & & s_n \end{pmatrix},$$
because all $s_j \geq 0$ are real, so the conjugate operation does nothing. As a result, reading the matrix, this implies:
$$T^*f_j = s_j e_j.$$
As a result, because $f_1, \dots, f_n$ is an orthonormal basis, we can write $v = \langle v, f_1\rangle f_1 + \cdots + \langle v, f_n\rangle f_n$, so by linearity:
$$T^*v = s_1\langle v, f_1\rangle e_1 + \cdots + s_n\langle v, f_n\rangle e_n.$$
(2) Apply $T^*$ to the left of $T$ via the matrices:
$$\mathcal{M}\bigl(T^*T, (e_1, \dots, e_n)\bigr) = \begin{pmatrix} s_1^2 & & \\ & \ddots & \\ & & s_n^2 \end{pmatrix},$$
so $T^*Te_j = s_j^2 e_j$. As such, expanding $v = \langle v, e_1\rangle e_1 + \cdots + \langle v, e_n\rangle e_n$ in the orthonormal basis:
$$T^*Tv = s_1^2\langle v, e_1\rangle e_1 + \cdots + s_n^2\langle v, e_n\rangle e_n.$$
(3) Since $\sqrt{T^*T}$ is positive, it is the unique positive square root of $T^*T$. From (2), we know that the eigenvalues of $T^*T$ are given by $s_j^2$ for each eigenvector $e_j$. As such, $\sqrt{T^*T}$'s eigenvalues are just the nonnegative square roots of $T^*T$'s eigenvalues, which are the $s_j$ because singular values are nonnegative. As such:
$$\sqrt{T^*T}\,e_j = s_j e_j$$
by this finding, so expanding $v$ in the orthonormal basis $e_1, \dots, e_n$:
$$\sqrt{T^*T}\,v = s_1\langle v, e_1\rangle e_1 + \cdots + s_n\langle v, e_n\rangle e_n.$$
(4) Since $T$ is invertible, every $s_j \neq 0$. Define $Rv = \frac{\langle v, f_1\rangle e_1}{s_1} + \cdots + \frac{\langle v, f_n\rangle e_n}{s_n}$. Apply $RT$ and $TR$ and see they are the identity:
$$RTv = R\bigl(s_1\langle v, e_1\rangle f_1 + \cdots\bigr) = \langle v, e_1\rangle e_1 + \cdots + \langle v, e_n\rangle e_n = v,$$
$$TRv = T\!\left(\frac{\langle v, f_1\rangle e_1}{s_1} + \cdots\right) = \langle v, f_1\rangle f_1 + \cdots + \langle v, f_n\rangle f_n = v,$$
thus $R$ is the correct definition for the inverse of $T$, i.e. $T^{-1}v = \frac{\langle v, f_1\rangle e_1}{s_1} + \cdots + \frac{\langle v, f_n\rangle e_n}{s_n}$.
☐
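All four formulas can be verified numerically against the matrix SVD (a NumPy sketch; here the $e_j$ are the rows of $V^t$ and the $f_j$ the columns of $U$ from `np.linalg.svd`):

```python
import numpy as np

rng = np.random.default_rng(4)
T = rng.standard_normal((3, 3))   # invertible with probability 1

# T v = sum_j s_j <v, e_j> f_j, with e_j, f_j the right/left singular vectors.
U, s, Vt = np.linalg.svd(T)
E, F = Vt.T, U   # columns of E are e_j, columns of F are f_j

def from_formula(coeff, in_basis, out_basis, v):
    # sum_j coeff_j * <v, in_j> * out_j
    return out_basis @ (coeff * (in_basis.T @ v))

v = rng.standard_normal(3)
assert np.allclose(T.T @ v, from_formula(s, F, E, v))                 # (1): T* v
assert np.allclose(T.T @ T @ v, from_formula(s**2, E, E, v))          # (2): T*T v
assert np.allclose(T.T @ T @ v,                                       # (3): the (E, E, s)
                   from_formula(s, E, E, from_formula(s, E, E, v)))   #      map squares to T*T
assert np.allclose(np.linalg.inv(T) @ v, from_formula(1/s, F, E, v))  # (4): T^{-1} v
print("all four singular-value formulas check out")
```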