Lecture 28 - Continuing Inner Product Spaces
Review of Last Time...
Consider an inner product space $V$ with the norm
$$\|v\| = \sqrt{\langle v, v \rangle}$$
which is the norm induced by the inner product.
Norms are typically defined the same way in other areas of mathematics; fields like Real Analysis and Abstract Algebra use this same definition.
Here, if $\langle u, v \rangle = 0$, then
$$\|u + v\|^2 = \|u\|^2 + \|v\|^2$$
which we proved in Chapter 6 - Inner Product Spaces#^4c14b5 below.
We also talked about orthogonal decomposition, which is covered in our notes at Chapter 6 - Inner Product Spaces#^315ed2, where we had:
$$u = \frac{\langle u, v \rangle}{\|v\|^2} v + \left( u - \frac{\langle u, v \rangle}{\|v\|^2} v \right)$$
Notice that the orange vector is the projection we saw in Calc III:
$$\operatorname{proj}_{\vec v} \vec u = \frac{\vec u \cdot \vec v}{\|\vec v\|^2} \, \vec v$$
which is now generalized here to any inner product space.
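As a quick numerical sketch of the decomposition (plain Python; the helper names `dot` and `norm` and the sample vectors are my own), we can split u into a piece parallel to v plus a piece orthogonal to v using the standard dot product on R^3:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

# Decompose u into a piece parallel to v plus a piece orthogonal to v:
# u = (<u, v> / ||v||^2) v + w
u = [3.0, 4.0, 0.0]
v = [1.0, 1.0, 1.0]
c = dot(u, v) / dot(v, v)            # scalar <u, v> / ||v||^2
parallel = [c * x for x in v]        # projection of u onto v
w = [a - b for a, b in zip(u, parallel)]

print(dot(w, v))  # ~0: the leftover piece w is orthogonal to v
```

The two pieces sum back to u by construction, and the near-zero dot product confirms orthogonality up to floating-point error.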
Cauchy-Schwarz Inequality
We'll prove the following:

Given two vectors $u, v \in V$,
$$|\langle u, v \rangle| \le \|u\| \, \|v\|$$
where we have equality when $u$ and $v$ are scalar multiples of one another.
Proof
If either $u = 0$ or $v = 0$, both sides equal $0$ and the inequality holds, so suppose $v \ne 0$. By the orthogonal decomposition,
$$u = \frac{\langle u, v \rangle}{\|v\|^2} v + w, \qquad \langle w, v \rangle = 0,$$
we know that the left vector in the sum is parallel to $v$. Applying the Pythagorean theorem, the whole scalar in the left vector can come out:
$$\|u\|^2 = \left\| \frac{\langle u, v \rangle}{\|v\|^2} v \right\|^2 + \|w\|^2 = \frac{|\langle u, v \rangle|^2}{\|v\|^2} + \|w\|^2$$
Notice that the left term in the sum is non-negative, and dropping $\|w\|^2 \ge 0$ gives
$$\|u\|^2 \ge \frac{|\langle u, v \rangle|^2}{\|v\|^2}$$
Thus, multiplying over and taking the square root:
$$|\langle u, v \rangle| \le \|u\| \, \|v\|$$
Notice that we only get equality when $w = 0$, i.e., when $u$ is a scalar multiple of $v$.
☐
Bringing this back to our standard dot product, we get:
$$|\vec u \cdot \vec v| \le \|\vec u\| \, \|\vec v\|$$
since $\vec u \cdot \vec v = \|\vec u\| \|\vec v\| \cos\theta$ and $|\cos\theta| \le 1$.
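As a sanity check of the inequality under the standard dot product (a plain Python sketch; the sample vectors are arbitrary choices of mine):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

u = [1.0, -2.0, 3.0]
v = [4.0, 0.0, -1.0]

# Cauchy-Schwarz: |<u, v>| <= ||u|| * ||v||
lhs = abs(dot(u, v))
rhs = norm(u) * norm(v)
print(lhs <= rhs)  # True

# Equality exactly when the vectors are parallel:
p = [2.0 * x for x in u]
print(math.isclose(abs(dot(u, p)), norm(u) * norm(p)))  # True
```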
Triangle Inequality
We do something similar for this new theorem:
$$\|u + v\| \le \|u\| + \|v\|$$
where equality holds when one of $u, v$ is a non-negative scalar multiple of the other.
Proof
Consider
$$\|u + v\|^2 = \langle u + v, u + v \rangle = \|u\|^2 + \|v\|^2 + 2\operatorname{Re}\langle u, v \rangle \le \|u\|^2 + \|v\|^2 + 2\|u\|\,\|v\| = (\|u\| + \|v\|)^2$$
where the inequality uses Cauchy-Schwarz. Square rooting both sides gives the theorem above.
☐
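A quick numerical check of the triangle inequality for the Euclidean norm (plain Python sketch; the sample vectors are my own):

```python
import math

def norm(u):
    return math.sqrt(sum(x * x for x in u))

# Triangle inequality: ||u + v|| <= ||u|| + ||v||
u = [1.0, 2.0, 2.0]
v = [3.0, -4.0, 0.0]
w = [a + b for a, b in zip(u, v)]

print(norm(w) <= norm(u) + norm(v))  # True
```

Here norm(u) = 3 and norm(v) = 5, while the sum vector has a strictly smaller norm, as expected since u and v are not parallel.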
Intro to 6.B: Unit Norms
Recall that unit vectors were useful in many settings, so we generalize that idea below:
A list of vectors that are pairwise orthogonal (each vector is orthogonal to every other vector in the list) and all of norm 1 is called orthonormal.
Here it just means that:
- all vectors are orthogonal (perpendicular) to each other
- all vectors are unit length
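These two conditions can be checked directly; below is a small sketch in plain Python (the helper name `is_orthonormal` is my own, and the example vectors are an assumed rotated basis of R^2):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthonormal(vectors, tol=1e-9):
    # Orthonormal means <e_i, e_j> = 1 when i == j (unit length)
    # and <e_i, e_j> = 0 when i != j (pairwise orthogonal).
    for i, u in enumerate(vectors):
        for j, v in enumerate(vectors):
            target = 1.0 if i == j else 0.0
            if abs(dot(u, v) - target) > tol:
                return False
    return True

s = 1 / math.sqrt(2)
print(is_orthonormal([[s, s], [s, -s]]))          # True
print(is_orthonormal([[1.0, 1.0], [1.0, -1.0]]))  # False: norms are sqrt(2)
```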
For instance, the standard basis $e_1, \ldots, e_n$ of $\mathbf{F}^n$ is an orthonormal list.
Suppose a different list $e_1, \ldots, e_m$ is orthonormal; if we want the norm of the vector $a_1 e_1 + \cdots + a_m e_m$, we find:
$$\|a_1 e_1 + \cdots + a_m e_m\|^2 = |a_1|^2 + \cdots + |a_m|^2$$
which follows by repeatedly applying the Pythagorean theorem.
So taking the square root, we notice that finding norms of vectors written in this format is very easy: just sum the squares of the coefficients and take the square root.
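This formula is easy to verify numerically; here is a plain Python sketch (the orthonormal pair below is an assumed example, a rotated standard basis of R^2, and the coefficients are arbitrary):

```python
import math

def norm(u):
    return math.sqrt(sum(x * x for x in u))

# An orthonormal list in R^2 (assumed example): rotated standard basis
s = 1 / math.sqrt(2)
e1 = [s, s]
e2 = [s, -s]

a1, a2 = 3.0, -4.0
combo = [a1 * x + a2 * y for x, y in zip(e1, e2)]

# ||a1 e1 + a2 e2|| should equal sqrt(|a1|^2 + |a2|^2)
print(math.isclose(norm(combo), math.sqrt(a1**2 + a2**2)))  # True
```

With coefficients 3 and -4, the norm comes out to 5 without ever expanding the inner products by hand.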