Lecture 19 - Finishing Decomposition, Some Applications
Recall:
polar decomposition
Given $T \in \mathcal{L}(V)$, then there exists an isometry $S \in \mathcal{L}(V)$ such that:
$$T = S\sqrt{T^{*}T}$$
In essence, $S$ is a simple operator, so $T$ and $\sqrt{T^{*}T}$ are very similar. We want to look at the operator $\sqrt{T^{*}T}$ in more detail, since we know a lot of its properties even when $T$ is a mess.
Notice that $\sqrt{T^{*}T}$ is a positive operator, so it must have non-negative eigenvalues. We used this to make:
singular values of $T$
The singular values of some $T \in \mathcal{L}(V)$ are the eigenvalues of $\sqrt{T^{*}T}$, with each eigenvalue repeated as many times as the dimension of its eigenspace.
In particular, there are exactly $n = \dim V$ singular values, and they are all non-negative. Because of this, we can order them, so we typically write them in descending order:
$$s_{1} \geq s_{2} \geq \dots \geq s_{n} \geq 0$$
or more traditionally:
$$\sigma_{1} \geq \sigma_{2} \geq \dots \geq \sigma_{n} \geq 0$$
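As a quick numerical sanity check (my own, not from the lecture), the following NumPy sketch computes the singular values of an arbitrary stand-in matrix `A` as the square roots of the eigenvalues of $A^{*}A$ and compares them with what `np.linalg.svd` reports:

```python
import numpy as np

# Arbitrary stand-in for the matrix of T (not the lecture's example).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Eigenvalues of A*A (Hermitian, so eigvalsh applies); they are all >= 0.
evals = np.linalg.eigvalsh(A.conj().T @ A)

# Singular values = eigenvalues of sqrt(A*A) = square roots of the eigenvalues
# of A*A, conventionally listed in descending order.
singular_values = np.sqrt(evals)[::-1]

print(singular_values)                     # descending: s_1 >= s_2 >= 0
print(np.linalg.svd(A, compute_uv=False))  # library routine agrees
```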
An Example
Let $T$ be defined by:
Thus:
Further:
Notice then:
Notice that we know that $T^{*}T$ is positive, as shown above, since its matrix is symmetric. It has eigenvectors:
With eigenvalues:
So then the matrix of $\sqrt{T^{*}T}$ with respect to the eigenbasis (each eigenvector normalized) is:
Thus:
Notice that we have our singular values on the diagonal, namely:
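The example's specific matrix and numbers didn't survive in these notes, but the procedure can be sketched in NumPy with a placeholder matrix `A` (my choice, not the lecture's): form $T^{*}T$, find its orthonormal eigenbasis, take square roots of the eigenvalues, and the diagonal entries in that basis are the singular values.

```python
import numpy as np

# Placeholder matrix standing in for the lecture's example (the original
# entries are not recoverable from these notes).
A = np.array([[4.0, 0.0],
              [3.0, 5.0]])

ATA = A.conj().T @ A                # T*T: positive, hence symmetric
evals, evecs = np.linalg.eigh(ATA)  # orthonormal eigenbasis (columns of evecs)

# Matrix of sqrt(T*T) with respect to the eigenbasis: diagonal, with the
# square roots of the eigenvalues on the diagonal.
sqrt_diag = np.diag(np.sqrt(evals))

# Change back to the standard basis, then verify the diagonal representation.
sqrtATA = evecs @ sqrt_diag @ evecs.conj().T
print(np.sqrt(evals)[::-1])                                      # singular values, descending
print(np.allclose(evecs.conj().T @ sqrtATA @ evecs, sqrt_diag))  # True
```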
So the singular value decomposition is as follows, but we'll need a definition.
Singular Value Decomposition
singular value decomposition
Suppose $T \in \mathcal{L}(V)$ has singular values $s_{1}, \dots, s_{n}$. Then there exist orthonormal bases $e_{1}, \dots, e_{n}$ and $f_{1}, \dots, f_{n}$ of $V$ such that:
$$Tv = s_{1}\langle v, e_{1}\rangle f_{1} + \dots + s_{n}\langle v, e_{n}\rangle f_{n}$$
for every $v \in V$.
Notice it's doing a scaling and a swap. It swaps all the $e_{i}$'s with $f_{i}$'s and then scales respectively by $s_{i}$.
In general:
$$Tv = \sum_{i=1}^{n} s_{i}\langle v, e_{i}\rangle f_{i}$$
Notice that:
$$Te_{i} = s_{i}f_{i}$$
This implies that every operator has a diagonal matrix, namely $\operatorname{diag}(s_{1}, \dots, s_{n})$, provided we allow two different orthonormal bases (one for the input, one for the output)!
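To make this concrete (my own illustration with a random matrix, not from the lecture), take $e_{i}$ as the columns of $V$ and $f_{i}$ as the columns of $U$ from a library SVD; then the formula $Tv = \sum_{i} s_{i}\langle v, e_{i}\rangle f_{i}$ checks out, and the matrix of $T$ with respect to these two bases is exactly $\operatorname{diag}(s_{1}, \dots, s_{n})$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))          # stand-in for the matrix of T
U, s, Vh = np.linalg.svd(A)

e = Vh.conj().T   # columns e_1, ..., e_n (orthonormal basis for the input)
f = U             # columns f_1, ..., f_n (orthonormal basis for the output)

# Check Tv = s_1 <v,e_1> f_1 + ... + s_n <v,e_n> f_n on a random vector.
v = rng.standard_normal(3)
Tv_formula = sum(s[i] * (e[:, i].conj() @ v) * f[:, i] for i in range(3))
print(np.allclose(A @ v, Tv_formula))                # True

# The matrix of T with respect to the bases (e_i), (f_i) is diagonal.
print(np.allclose(f.conj().T @ A @ e, np.diag(s)))   # True
```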
Proof
The operator $\sqrt{T^{*}T}$ is:
positive
self-adjoint
So there is an orthonormal eigenbasis (ONEB) $e_{1}, \dots, e_{n}$ of $V$ corresponding to the eigenvalues $s_{1}, \dots, s_{n}$ for $\sqrt{T^{*}T}$.
If $v \in V$ is arbitrary, then:
$$\sqrt{T^{*}T}\,v = s_{1}\langle v, e_{1}\rangle e_{1} + \dots + s_{n}\langle v, e_{n}\rangle e_{n}$$
Apply the isometry $S$ (there may be many, but just pick one) from the polar decomposition $T = S\sqrt{T^{*}T}$ to get:
$$Tv = S\sqrt{T^{*}T}\,v = s_{1}\langle v, e_{1}\rangle Se_{1} + \dots + s_{n}\langle v, e_{n}\rangle Se_{n}$$
where we can just define $f_{i} = Se_{i}$, which are orthonormal since $S$ is an isometry.
☐
In matrix language, this is what the Singular-Value Decomposition (SVD) looks like.
Singular-Value Decomposition (SVD)
Given an $m \times n$ complex matrix $M$, there exist matrices $U$, $S$, and $V$, where:
$S$ is an $m \times n$ rectangular diagonal matrix with non-negative real numbers on the diagonal (namely, the singular values of $M$)
$U$ is an $m \times m$ complex unitary matrix, and $V$ is an $n \times n$ complex unitary matrix
Such that:
$$M = USV^{*}$$
Here unitary means:
unitary
Unitary means $U^{*}U = UU^{*} = I$ (equivalently, $U^{-1} = U^{*}$). The matrix representation of any isometry is always unitary.
Notice if $M$ is real, then since $V^{*} = V^{T}$, we get $M = USV^{T}$, where $U$ and $V$ are real orthogonal ($U^{T}U = I$, $V^{T}V = I$) matrices.
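Here is a minimal NumPy check of the factorization (the matrix below is arbitrary, chosen by me); note that `np.linalg.svd` returns $V^{*}$ (as `Vh`) rather than $V$, and returns the singular values as a vector that has to be placed into the rectangular matrix $S$ by hand:

```python
import numpy as np

# Arbitrary complex m x n matrix (m = 3, n = 2).
M = np.array([[1 + 1j, 2.0],
              [0.0,    3 - 2j],
              [4.0,    1j]])

U, s, Vh = np.linalg.svd(M)       # U: m x m, Vh = V*: n x n
S = np.zeros(M.shape)             # m x n rectangular "diagonal" matrix
S[:len(s), :len(s)] = np.diag(s)

print(np.allclose(M, U @ S @ Vh))                # M = U S V*
print(np.allclose(U.conj().T @ U, np.eye(3)))    # U is unitary
print(np.allclose(Vh @ Vh.conj().T, np.eye(2)))  # V is unitary

# For a real M, U and V come out real orthogonal, so M = U S V^T.
```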
An Example
Namely, writing $u_{i}$ for the columns of $U$ and $v_{i}$ for the columns of $V$:
$$M = USV^{*} = \sum_{i} s_{i}u_{i}v_{i}^{*}$$
Notice if we order the $s_{i}$'s from biggest to smallest, then for large $i$ our $s_{i} \approx 0$, so we could just take a partial sum of these and get an approximation for $M$. As such, define a matrix $M_{k}$ as this approximation:
$$M_{k} = s_{1}u_{1}v_{1}^{*} + \dots + s_{k}u_{k}v_{k}^{*}$$
As a result, looking at the decomposition above, we would only need the first $k$ columns of $U$ and the first $k$ rows of $V^{*}$. This is called a rank $k$ approximator of $M$.
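A short sketch of that truncation (the function name `rank_k_approx` and the random test matrix are mine): keep the first $k$ columns of $U$, the $k$ largest singular values, and the first $k$ rows of $V^{*}$.

```python
import numpy as np

def rank_k_approx(M, k):
    """Rank-k approximation of M built from its truncated SVD."""
    U, s, Vh = np.linalg.svd(M, full_matrices=False)
    # Keep only the k largest singular values and the matching columns/rows.
    return U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]

rng = np.random.default_rng(1)
M = rng.standard_normal((6, 4))

for k in range(1, 5):
    Mk = rank_k_approx(M, k)
    print(k, np.linalg.matrix_rank(Mk), np.linalg.norm(M - Mk))
```

The printed error $\|M - M_{k}\|$ shrinks as $k$ grows, which is the sense in which the partial sum approximates $M$ while storing far less data.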