Lecture 28 - Continuing Inner Product Spaces

Review of Last Time...

Consider the inner product space $V$, as defined in Chapter 6 - Inner Product Spaces#^a64259, equipped with $\langle \cdot, \cdot \rangle$ as the inner product. There's an array of properties that we saw in Chapter 6 - Inner Product Spaces#Inner Products, allowing us to define the norm via:

$$\|v\| = \sqrt{\langle v, v \rangle}$$

which is the norm induced by the inner product.
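As a quick numerical sketch (using the standard dot product on $\mathbb{R}^n$ as the inner product; the function names here are just illustrative):

```python
import math

def inner(u, v):
    """Standard dot product on R^n, our model inner product."""
    return sum(x * y for x, y in zip(u, v))

def norm(v):
    """Norm induced by the inner product: ||v|| = sqrt(<v, v>)."""
    return math.sqrt(inner(v, v))

# The induced norm recovers the usual Euclidean length:
print(norm([3.0, 4.0]))  # → 5.0
```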

norms in a higher context

Norms are defined in this same way in other areas of mathematics; fields like Real Analysis and Abstract Algebra use this same definition.

Here, $\langle u, v \rangle = 0$ if and only if $u, v$ are orthogonal. Recall the Pythagorean Theorem:

Pythagorean Theorem

Suppose $u, v$ are orthogonal vectors in $V$. Then:

$$\|u + v\|^2 = \|u\|^2 + \|v\|^2$$

which we proved in Chapter 6 - Inner Product Spaces#^4c14b5.

We also talked about orthogonal decomposition, which is discussed in our notes via Chapter 6 - Inner Product Spaces#^315ed2,

where we had:

$$u = \underbrace{\frac{\langle u, v \rangle}{\|v\|^2} v}_{\text{orange vector}} + \underbrace{u - \frac{\langle u, v \rangle}{\|v\|^2} v}_{\text{blue vector}}$$

Notice that the orange vector is what we've seen in Calc III. We saw:

$$\operatorname{proj}_v u = \frac{u \cdot v}{\|v\|^2} v$$

which here now is more generalized.
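A small sketch of the decomposition (again using the standard dot product on $\mathbb{R}^n$; `inner` and `proj` are just illustrative names):

```python
def inner(u, v):
    """Standard dot product on R^n."""
    return sum(x * y for x, y in zip(u, v))

def proj(u, v):
    """Projection of u onto v: (<u, v> / ||v||^2) v."""
    c = inner(u, v) / inner(v, v)
    return [c * x for x in v]

u, v = [2.0, 3.0], [1.0, 0.0]
p = proj(u, v)                      # the "orange" component, parallel to v
w = [a - b for a, b in zip(u, p)]   # the "blue" component
print(p, w)          # [2.0, 0.0] [0.0, 3.0]
print(inner(w, v))   # 0.0 — w is orthogonal to v
```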

Cauchy-Schwarz Inequality

We'll prove the following:

Cauchy-Schwarz Inequality

Given two vectors $u, v \in V$:

$$|\langle u, v \rangle| \leq \|u\| \, \|v\|$$

where we have equality if and only if $u, v$ are scalar multiples of each other.

Proof
If $v = 0$, then both sides are $0$ and we get equality trivially, so suppose $v \neq 0$. Thus:

$$u = \frac{\langle u, v \rangle}{\|v\|^2} v + w$$

where the left vector in the sum is parallel to $v$, while $w$ is orthogonal to $v$. Since the two terms are orthogonal, we can apply the Pythagorean Theorem to them:

$$\|u\|^2 = \left\| \frac{\langle u, v \rangle}{\|v\|^2} v \right\|^2 + \|w\|^2$$

The whole scalar in the left term can come out:

$$\|u\|^2 = \left| \frac{\langle u, v \rangle}{\|v\|^2} \right|^2 \|v\|^2 + \|w\|^2 = \frac{|\langle u, v \rangle|^2}{\|v\|^2} + \|w\|^2$$

Notice that the left term in the sum is non-negative, and $\|w\|^2$ is also non-negative. Hence:

$$\|u\|^2 \geq \frac{|\langle u, v \rangle|^2}{\|v\|^2}$$

Thus, multiplying both sides by $\|v\|^2$ and taking the square root:

$$\|u\| \, \|v\| \geq |\langle u, v \rangle|$$

Notice that equality holds exactly when $\|w\|^2 = 0$, i.e. $w = 0$, in which case $u = \frac{\langle u, v \rangle}{\|v\|^2} v$, so $u, v$ are scalar multiples of each other.

Bringing this back to our standard dot product, where:

$$a \cdot b = |a| \, |b| \cos(\theta) \implies |a \cdot b| = |a| \, |b| \, |\cos(\theta)| \leq |a| \, |b|$$

since $0 \leq |\cos(\theta)| \leq 1$.
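We can check the inequality (and the equality case for scalar multiples) numerically, again with the standard dot product on $\mathbb{R}^n$; the vectors chosen below are arbitrary examples:

```python
import math

def inner(u, v):
    """Standard dot product on R^n."""
    return sum(x * y for x, y in zip(u, v))

def norm(v):
    return math.sqrt(inner(v, v))

u, v = [1.0, 2.0, 3.0], [4.0, -1.0, 2.0]
lhs = abs(inner(u, v))     # |<u, v>|
rhs = norm(u) * norm(v)    # ||u|| ||v||
print(lhs <= rhs)          # True

# Equality when the vectors are scalar multiples of each other:
w = [2.0 * x for x in u]
print(math.isclose(abs(inner(u, w)), norm(u) * norm(w)))  # True
```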

Triangle Inequality

We do something similar for this new theorem:

Triangle Inequality

$$\|u + v\| \leq \|u\| + \|v\|$$

where equality holds when one of $u, v$ is a nonnegative scalar multiple of the other.

Proof
Consider $\|u + v\|^2$ first:

$$\begin{aligned}
\|u + v\|^2 &= \langle u + v, u + v \rangle \\
&= \langle u, u \rangle + \langle v, v \rangle + \langle u, v \rangle + \langle v, u \rangle \\
&= \langle u, u \rangle + \langle v, v \rangle + 2 \operatorname{Re} \langle u, v \rangle && \text{(adding two conjugates gives $2\operatorname{Re}(z)$)} \\
&\leq \|u\|^2 + \|v\|^2 + 2 |\langle u, v \rangle| && \text{(real part is at most the magnitude)} \\
&\leq \|u\|^2 + \|v\|^2 + 2 \|u\| \, \|v\| && \text{(Cauchy-Schwarz Inequality)} \\
&= (\|u\| + \|v\|)^2
\end{aligned}$$

Square rooting both sides gives the theorem above.
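A numerical sanity check of the theorem (standard dot product on $\mathbb{R}^n$; the test vectors are arbitrary):

```python
import math

def norm(v):
    """Euclidean norm, induced by the standard dot product."""
    return math.sqrt(sum(x * x for x in v))

u, v = [1.0, -2.0, 2.0], [3.0, 0.0, 4.0]
s = [a + b for a, b in zip(u, v)]
print(norm(s) <= norm(u) + norm(v))   # True

# Equality when one vector is a positive multiple of the other:
t = [2.0 * x for x in u]
st = [a + b for a, b in zip(u, t)]
print(math.isclose(norm(st), norm(u) + norm(t)))  # True
```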

Intro to 6.B: Unit Norms

Recall that we used unit vectors to do a lot of things, so we define that below:

orthonormal

A list of vectors that are pairwise orthogonal (every two distinct vectors in the list are orthogonal) and all of norm $1$ is called orthonormal.

Here it just means that:

$$\langle e_i, e_j \rangle = \begin{cases} 1 & i = j \\ 0 & i \neq j \end{cases}$$

For instance, the list $e_1, e_2, \ldots, e_n$, i.e. the standard basis in $\mathbb{R}^n$, is orthonormal.

Suppose a different list $e_1, \ldots, e_n$ is an orthonormal list in an inner product space $V$. Suppose that we took a linear combination of these:

$$\alpha_1 e_1 + \cdots + \alpha_n e_n$$

If we want the norm of this vector, note that $\alpha_1 e_1$ is orthogonal to $\alpha_2 e_2 + \cdots + \alpha_n e_n$, so by the Pythagorean Theorem:

$$\|\alpha_1 e_1 + \cdots + \alpha_n e_n\|^2 = \|\alpha_1 e_1\|^2 + \|\alpha_2 e_2 + \cdots + \alpha_n e_n\|^2$$

which we repeat $n - 1$ times to get:

$$\|\alpha_1 e_1 + \cdots + \alpha_n e_n\|^2 = \sum_{i=1}^{n} \|\alpha_i e_i\|^2 = \sum_{i=1}^{n} |\alpha_i|^2$$

where the last step uses that each $e_i$ has norm $1$, so $\|\alpha_i e_i\| = |\alpha_i|$.

So, taking square roots, finding the norm of a vector written in this format is very easy: just sum the squares of the coefficients and take the square root.
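A concrete sketch of this formula in $\mathbb{R}^3$ with the standard basis (which is orthonormal); the coefficients below are an arbitrary example:

```python
import math

# Standard basis of R^3 — an orthonormal list
e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
alpha = [2.0, -1.0, 2.0]

# Linear combination alpha_1 e_1 + ... + alpha_n e_n
v = [sum(a * ei[j] for a, ei in zip(alpha, e)) for j in range(3)]

# Direct norm vs. the sum-of-squares formula
norm_v = math.sqrt(sum(x * x for x in v))
sum_formula = math.sqrt(sum(a * a for a in alpha))
print(norm_v, sum_formula)   # both 3.0
```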