The function that takes $\big((x_1, x_2), (y_1, y_2)\big)$ to $|x_1 y_1| + |x_2 y_2|$ is not an inner product on $\mathbf{R}^2$.
Proof
We'll show that one of the inner-product properties fails. The first two (positivity and definiteness) are easily shown to be satisfied no matter what, but additivity in the first slot is not. Consider $u = (1, 0)$, $w = (-1, 0)$, and $v = (1, 0)$. Then:
$$\langle u + w, v \rangle = \langle (0, 0), (1, 0) \rangle = |0 \cdot 1| + |0 \cdot 0| = 0$$
while:
$$\langle u, v \rangle + \langle w, v \rangle = \big(|1 \cdot 1| + |0 \cdot 0|\big) + \big(|(-1) \cdot 1| + |0 \cdot 0|\big) = 2$$
hence additivity in the first slot is not satisfied, so the function is not a valid inner product on $\mathbf{R}^2$.
☐
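A minimal numeric sketch of this counterexample, assuming the candidate pairing is $f\big((x_1, x_2), (y_1, y_2)\big) = |x_1 y_1| + |x_2 y_2|$; the absolute values are what break additivity in the first slot:

```python
# Hypothetical candidate pairing: the absolute values spoil additivity.
def f(x, y):
    return abs(x[0] * y[0]) + abs(x[1] * y[1])

u, w, v = (1, 0), (-1, 0), (1, 0)
uw = (u[0] + w[0], u[1] + w[1])  # u + w = (0, 0)

print(f(uw, v))           # 0
print(f(u, v) + f(w, v))  # 2, so f(u + w, v) != f(u, v) + f(w, v)
```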
2
Theorem
The function that takes $\big((x_1, x_2, x_3), (y_1, y_2, y_3)\big)$ to $x_1 y_1 + x_3 y_3$ is not an inner product on $\mathbf{R}^3$.
Proof
We can easily show that if $v = (0, 1, 0)$ then clearly $v \neq 0$ while:
$$\langle v, v \rangle = 0 \cdot 0 + 0 \cdot 0 = 0$$
so the property of definiteness is not satisfied.
☐
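A one-line numeric check, assuming the candidate pairing is $g(x, y) = x_1 y_1 + x_3 y_3$, which ignores the middle coordinate:

```python
# Hypothetical candidate pairing on R^3 that ignores the middle coordinate.
def g(x, y):
    return x[0] * y[0] + x[2] * y[2]

v = (0, 1, 0)   # nonzero vector
print(g(v, v))  # 0: definiteness fails
```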
3
Consider when $\mathbf{F} = \mathbf{R}$ and $V \neq \{0\}$. We replace positivity ($\langle v, v \rangle \geq 0$ for all $v \in V$) with the condition that $\langle v, v \rangle > 0$ for some $v \in V$. We'll show that the set of functions from $V \times V$ to $\mathbf{R}$ that are inner products on $V$ does not change.
Proof
Since $\mathbf{F} = \mathbf{R}$, the conjugate symmetry property simply says that $\langle u, v \rangle = \langle v, u \rangle$ for any $u, v \in V$. We also still have additivity and homogeneity in the first slot, as well as definiteness.
Let $\langle \cdot, \cdot \rangle$ be an arbitrary function satisfying these properties. If we can show, using the properties prior to the change, that we get the new property, and if we can use the new property along with the others to recover the switched-out property, then the proof is complete.
First, consider the properties before the change. Since $V \neq \{0\}$, there must be some non-zero vector $v \in V$. Hence, by positivity, $\langle v, v \rangle \geq 0$. Since $v \neq 0$, definiteness gives $\langle v, v \rangle \neq 0$, so we must have $\langle v, v \rangle > 0$ as required.
Now, consider the properties after the change (so we cannot use positivity like we did before). There is some $w \in V$ where $\langle w, w \rangle > 0$; in particular $w \neq 0$, so $V$ contains the zero vector and at least one non-zero vector. Let $v \in V$ be arbitrary. If $v = 0$ then clearly $\langle v, v \rangle = 0 \geq 0$; if $v \neq 0$, assume for contradiction that $\langle v, v \rangle < 0$.
Consider the following cases. If $v$ is a scalar multiple of $w$, say $v = cw$ with $c \neq 0$, then:
$$\langle v, v \rangle = \langle cw, cw \rangle = c^2 \langle w, w \rangle > 0$$
which contradicts our assumption. Suppose instead that $v$ is not a scalar multiple of $w$, so for every $t \in \mathbf{R}$ the vector $v + tw \neq 0$, since if it were zero we'd get a scalar multiple. As such, notice that:
$$\langle v + tw, v + tw \rangle = \langle w, w \rangle t^2 + 2\langle v, w \rangle t + \langle v, v \rangle$$
Set this equal to $0$ and use the quadratic formula to try to solve for $t$ in terms of everything else:
$$t = \frac{-2\langle v, w \rangle \pm \sqrt{4\langle v, w \rangle^2 - 4\langle w, w \rangle \langle v, v \rangle}}{2\langle w, w \rangle}$$
Since $\langle w, w \rangle > 0$ and $\langle v, v \rangle < 0$, the discriminant satisfies:
$$4\langle v, w \rangle^2 - 4\langle w, w \rangle \langle v, v \rangle > 0$$
so $t \in \mathbf{R}$ specifically. Choosing this $t$ gives:
$$\langle v + tw, v + tw \rangle = 0$$
which by definiteness forces $v + tw = 0$, contradicting $v + tw \neq 0$; so our assumption was wrong. Hence $\langle v, v \rangle \geq 0$.
☐
4
Suppose $V$ is a real inner product space.
a
Theorem
$\langle u + v, u - v \rangle = \|u\|^2 - \|v\|^2$ for all $u, v \in V$.
Proof
Notice:
$$\langle u + v, u - v \rangle = \langle u, u \rangle - \langle u, v \rangle + \langle v, u \rangle - \langle v, v \rangle = \|u\|^2 - \|v\|^2$$
where $\langle u, v \rangle = \langle v, u \rangle$ because the inner product space is real.
☐
b
Theorem
If $u, v \in V$ have the same norm, then $u + v$ is orthogonal to $u - v$.
Proof
Suppose $u, v \in V$ have the same norm, so $\|u\| = \|v\|$. We'll show that $\langle u + v, u - v \rangle = 0$; by part (a):
$$\langle u + v, u - v \rangle = \|u\|^2 - \|v\|^2 = 0$$
☐
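Both identities can be spot-checked numerically with the usual dot product on $\mathbf{R}^2$ (sample vectors chosen arbitrarily):

```python
# Spot-check <u+v, u-v> = ||u||^2 - ||v||^2 with the dot product on R^2.
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

u, v = (3, 4), (1, -2)
lhs = dot((u[0] + v[0], u[1] + v[1]), (u[0] - v[0], u[1] - v[1]))
print(lhs, dot(u, u) - dot(v, v))  # 20 20

# Part (b): equal norms force u + v and u - v to be orthogonal.
u, v = (3, 4), (5, 0)  # both have norm 5
print(dot((u[0] + v[0], u[1] + v[1]), (u[0] - v[0], u[1] - v[1])))  # 0
```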
c
Theorem
The diagonals of a rhombus are perpendicular to each other.

Proof

A rhombus has sides of equal length, so two adjacent sides can be represented by vectors $u$ and $v$ with $\|u\| = \|v\|$; its diagonals are then $u + v$ and $u - v$. By part (b), $u + v$ is orthogonal to $u - v$, i.e. the diagonals are perpendicular.

☐

5

Theorem

Suppose $T \in \mathcal{L}(V)$ is such that $\|Tv\| \leq \|v\|$ for all $v \in V$. Then $T - \sqrt{2}\,I$ is invertible.
Proof
If we can show that $\sqrt{2}$ cannot be an eigenvalue of $T$, then $T - \sqrt{2}\,I$ is injective and therefore invertible. As such, assume for contradiction that $\sqrt{2}$ is an eigenvalue of $T$. Then there's some associated eigenvector $v \neq 0$ with $Tv = \sqrt{2}\,v$, but notice then that:
$$\|Tv\| = \left\|\sqrt{2}\,v\right\| = \sqrt{2}\,\|v\| > \|v\|$$
which contradicts our supposition that $\|Tv\| \leq \|v\|$, so our assumption was false. Hence, $\sqrt{2}$ cannot be an eigenvalue of $T$, so $T - \sqrt{2}\,I$ is invertible.
☐
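A small numeric illustration, assuming the theorem is the standard one (if $\|Tv\| \leq \|v\|$ for all $v$, then $T - \sqrt{2}\,I$ is invertible); the scaled rotation below is a hypothetical example of such a $T$:

```python
import math

# A hypothetical norm-nonincreasing operator T on R^2: a rotation scaled by 0.8.
c, s = math.cos(0.7), math.sin(0.7)
T = [[0.8 * c, -0.8 * s],
     [0.8 * s,  0.8 * c]]   # ||Tv|| = 0.8 ||v|| <= ||v|| for every v

r2 = math.sqrt(2)
M = [[T[0][0] - r2, T[0][1]],
     [T[1][0],      T[1][1] - r2]]
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]

print(det != 0)  # True: T - sqrt(2) I is invertible
```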
7
Theorem
Suppose $u, v \in V$. Then $\|au + bv\| = \|bu + av\|$ for all $a, b \in \mathbf{R}$ iff $\|u\| = \|v\|$.
Proof
The forward direction is easiest. Notice that choosing $a = 1$ and $b = 0$ gives the desired result:
$$\|u\| = \|1u + 0v\| = \|0u + 1v\| = \|v\|$$
Let's prove the reverse direction. Suppose $\|u\| = \|v\|$. Let $a, b \in \mathbf{R}$ be arbitrary. Notice that:
$$\|au + bv\|^2 = a^2\|u\|^2 + 2ab\langle u, v \rangle + b^2\|v\|^2 = b^2\|u\|^2 + 2ab\langle u, v \rangle + a^2\|v\|^2 = \|bu + av\|^2$$
where the middle equality uses $\|u\| = \|v\|$.
Square rooting both sides gives the desired result.
☐
8
Theorem
Suppose $u, v \in V$ and $\|u\| = \|v\| = 1$ and $\langle u, v \rangle = 1$. Then $u = v$.
Proof
Suppose the necessary assumptions. Notice that if we prove $\|u - v\| = 0$ then we've proved the theorem, since only the zero vector has norm $0$:
$$\|u - v\|^2 = \|u\|^2 - \langle u, v \rangle - \langle v, u \rangle + \|v\|^2 = 1 - 1 - 1 + 1 = 0$$
☐
10
Theorem
There exist vectors $u, v \in \mathbf{R}^2$ such that $u$ is a scalar multiple of $(1, 3)$, $v$ is orthogonal to $(1, 3)$, and $(1, 2) = u + v$.
Proof
Using the idea of orthogonalization, write $u = a(1, 3)$ and $v = (1, 2) - a(1, 3)$. Notice then that since $v$ must be orthogonal to $(1, 3)$:
$$0 = \langle (1, 2) - a(1, 3), (1, 3) \rangle = \langle (1, 2), (1, 3) \rangle - a\langle (1, 3), (1, 3) \rangle = 7 - 10a$$
so $a = \frac{7}{10}$. Thus:
$$u = \left(\tfrac{7}{10}, \tfrac{21}{10}\right), \qquad v = \left(\tfrac{3}{10}, -\tfrac{1}{10}\right)$$
where our required properties are validated by these vectors: $u + v = (1, 2)$, $u$ is a scalar multiple of $(1, 3)$, and $\langle v, (1, 3) \rangle = \tfrac{3}{10} - \tfrac{3}{10} = 0$.
☐
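A numeric sanity check, assuming the exercise data is the standard one: write $(1, 2) = u + v$ with $u$ a scalar multiple of $(1, 3)$ and $v$ orthogonal to $(1, 3)$:

```python
# Split w = (1, 2) into u (parallel to d = (1, 3)) plus v (orthogonal to d).
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

w, d = (1, 2), (1, 3)
a = dot(w, d) / dot(d, d)          # projection coefficient 7/10
u = (a * d[0], a * d[1])           # (0.7, 2.1)
v = (w[0] - u[0], w[1] - u[1])     # (0.3, -0.1) up to rounding

print(u, v)
print(dot(v, d))  # ~0: v is orthogonal to (1, 3)
```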
12
Theorem
For all positive integers $n$ and all $x_1, \ldots, x_n \in \mathbf{R}$:
$$(x_1 + \cdots + x_n)^2 \leq n\left(x_1^2 + \cdots + x_n^2\right)$$
Proof
Consider the vectors $x = (x_1, \ldots, x_n)$ and $y = (1, \ldots, 1)$ in $\mathbf{R}^n$. Notice that:
$$\langle x, y \rangle = x_1 + \cdots + x_n, \qquad \|x\|^2 = x_1^2 + \cdots + x_n^2, \qquad \|y\|^2 = n$$
Here, by Cauchy-Schwarz:
$$(x_1 + \cdots + x_n)^2 = \langle x, y \rangle^2 \leq \|x\|^2\|y\|^2 = n\left(x_1^2 + \cdots + x_n^2\right)$$
☐
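The inequality can be spot-checked on arbitrary sample data:

```python
# Spot-check (x1 + ... + xn)^2 <= n (x1^2 + ... + xn^2) on sample data.
xs = [3.0, -1.0, 4.0, 1.5, -5.0]
n = len(xs)
lhs = sum(xs) ** 2
rhs = n * sum(x * x for x in xs)
print(lhs, rhs, lhs <= rhs)  # 6.25 266.25 True
```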
13
Theorem
Suppose $u, v$ are nonzero vectors in $\mathbf{R}^2$. Then:
$$\langle u, v \rangle = \|u\|\,\|v\|\cos\theta$$
where $\theta$ is the angle between $u$ and $v$.
Proof
We draw out our triangle with sides $u$, $v$, and $u - v$, where $\theta$ is the angle between $u$ and $v$. Using the law of cosines:
$$\|u - v\|^2 = \|u\|^2 + \|v\|^2 - 2\|u\|\,\|v\|\cos\theta$$
On the other hand, expanding with the inner product:
$$\|u - v\|^2 = \|u\|^2 - 2\langle u, v \rangle + \|v\|^2$$
Comparing the two expressions gives $\langle u, v \rangle = \|u\|\,\|v\|\cos\theta$.
☐
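A quick numeric check of the identity $\langle u, v \rangle = \|u\|\,\|v\|\cos\theta$ in $\mathbf{R}^2$, computing the angle via `atan2`:

```python
import math

# Check <u, v> = ||u|| ||v|| cos(theta) for u, v in R^2.
u, v = (1, 0), (1, 1)
dot_uv = u[0] * v[0] + u[1] * v[1]
nu, nv = math.hypot(*u), math.hypot(*v)
theta = math.atan2(v[1], v[0]) - math.atan2(u[1], u[0])  # angle between u and v

print(dot_uv, nu * nv * math.cos(theta))  # both 1 (up to rounding)
```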
17
Theorem
There is no inner product on $\mathbf{R}^2$ such that the associated norm is given by:
$$\|(x_1, x_2)\| = \max\{|x_1|, |x_2|\}$$
Proof
Notice that if such an inner product existed, then by the polarization identity for real inner product spaces:
$$\langle u, v \rangle = \frac{\|u + v\|^2 - \|u - v\|^2}{4}$$
must be that inner product. We can try to verify each property as follows.
Positivity: for $v = (x_1, x_2)$:
$$\langle v, v \rangle = \frac{\|2v\|^2 - \|0\|^2}{4} = \|v\|^2 = \max\{|x_1|, |x_2|\}^2 \geq 0$$
so this candidate would have positivity (and, by the same computation, definiteness).
However, additivity in the first slot fails. Take $u = (1, 0)$, $w = (0, 1)$, and $v = (1, 1)$. Then:
$$\langle u, v \rangle = \frac{\|(2, 1)\|^2 - \|(0, -1)\|^2}{4} = \frac{4 - 1}{4} = \frac{3}{4} \qquad \text{and} \qquad \langle w, v \rangle = \frac{\|(1, 2)\|^2 - \|(-1, 0)\|^2}{4} = \frac{3}{4}$$
while:
$$\langle u + w, v \rangle = \frac{\|(2, 2)\|^2 - \|(0, 0)\|^2}{4} = 1 \neq \frac{3}{4} + \frac{3}{4}$$
Hence, if there were an inner product whose associated norm is the one defined above, it would fail at least one of the inner-product properties, proving the theorem.
☐
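An independent quick check: every norm induced by an inner product must satisfy the Parallelogram Equality, and (assuming the norm in question is the max norm) this fails already at the standard basis vectors:

```python
# Parallelogram Equality test for the max norm on R^2.
def maxnorm(x):
    return max(abs(x[0]), abs(x[1]))

u, v = (1, 0), (0, 1)
lhs = maxnorm((u[0] + v[0], u[1] + v[1]))**2 + maxnorm((u[0] - v[0], u[1] - v[1]))**2
rhs = 2 * (maxnorm(u)**2 + maxnorm(v)**2)
print(lhs, rhs)  # 2 4 -- the equality fails, so no inner product gives this norm
```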
18
Theorem
Suppose $p > 0$. Then there is an inner product on $\mathbf{R}^2$ such that the associated norm is given by:
$$\|(x_1, x_2)\| = \left(|x_1|^p + |x_2|^p\right)^{1/p}$$
for all $(x_1, x_2) \in \mathbf{R}^2$ iff $p = 2$.
Proof
We have to prove both directions. The reverse direction is easier: consider $p = 2$ first. Then choose the usual dot product, giving, for all $(x_1, x_2) \in \mathbf{R}^2$:
$$\|(x_1, x_2)\| = \sqrt{\langle (x_1, x_2), (x_1, x_2) \rangle} = \left(|x_1|^2 + |x_2|^2\right)^{1/2}$$
which is the required norm.
Now consider the other direction, so there is an inner product on $\mathbf{R}^2$ whose associated norm is given by the formula above. We'll have to show that $p = 2$. Notice that we can plug in values: take $u = (1, 0)$ and $v = (0, 1)$. By the Parallelogram Equality:
$$\|u + v\|^2 + \|u - v\|^2 = 2\left(\|u\|^2 + \|v\|^2\right)$$
which would be the same as:
$$2^{2/p} + 2^{2/p} = 2(1 + 1) = 4$$
Hence:
$$2^{2/p} = 2$$
so $\frac{2}{p} = 1$, giving $p = 2$ as required.
☐
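The parallelogram test can be run numerically across several values of $p$ (assuming the norm in question is $\left(|x_1|^p + |x_2|^p\right)^{1/p}$); the gap vanishes only at $p = 2$:

```python
# Gap in the Parallelogram Equality for the p-norm on R^2.
def pnorm(x, p):
    return (abs(x[0])**p + abs(x[1])**p) ** (1 / p)

def gap(p, u=(1, 0), v=(0, 1)):
    lhs = (pnorm((u[0] + v[0], u[1] + v[1]), p)**2
           + pnorm((u[0] - v[0], u[1] - v[1]), p)**2)
    rhs = 2 * (pnorm(u, p)**2 + pnorm(v, p)**2)
    return lhs - rhs

for p in (1, 2, 3):
    print(p, round(gap(p), 9))  # nonzero gap except at p = 2
```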
19
Theorem
Suppose $V$ is a real inner product space. Then:
$$\langle u, v \rangle = \frac{\|u + v\|^2 - \|u - v\|^2}{4}$$
for all $u, v \in V$.
Proof
Expanding the right-hand side's numerator:
$$\|u + v\|^2 - \|u - v\|^2 = \left(\|u\|^2 + 2\langle u, v \rangle + \|v\|^2\right) - \left(\|u\|^2 - 2\langle u, v \rangle + \|v\|^2\right) = 4\langle u, v \rangle$$
Dividing by 4 gives the desired result.
☐
20
Theorem
Suppose $V$ is a complex inner product space. Then:
$$\langle u, v \rangle = \frac{\|u + v\|^2 - \|u - v\|^2 + \|u + iv\|^2\,i - \|u - iv\|^2\,i}{4}$$
for all $u, v \in V$.
Proof
Expanding each norm via $\|u + \lambda v\|^2 = \|u\|^2 + |\lambda|^2\|v\|^2 + 2\operatorname{Re}\left(\bar{\lambda}\,\langle u, v \rangle\right)$ gives:
$$\|u + v\|^2 - \|u - v\|^2 = 4\operatorname{Re}\langle u, v \rangle \qquad \text{and} \qquad \|u + iv\|^2 - \|u - iv\|^2 = 4\operatorname{Im}\langle u, v \rangle$$
so the numerator is $4\operatorname{Re}\langle u, v \rangle + 4i\operatorname{Im}\langle u, v \rangle = 4\langle u, v \rangle$. Dividing by 4 on both sides gives the result.
☐
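The identity can be spot-checked in $\mathbf{C}^2$ with the standard inner product (linear in the first slot, conjugate-linear in the second), on arbitrarily chosen vectors:

```python
# Complex polarization identity in C^2 with the standard inner product.
def ip(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

def nsq(u):  # squared norm
    return ip(u, u).real

u = (1 + 2j, -3j)
v = (2 - 1j, 4 + 1j)

add = lambda lam: [a + lam * b for a, b in zip(u, v)]
rhs = (nsq(add(1)) - nsq(add(-1)) + nsq(add(1j)) * 1j - nsq(add(-1j)) * 1j) / 4

print(ip(u, v), rhs)  # both (-3-7j)
```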
24
Theorem
Suppose $S$ is an injective operator on $V$. Define $\langle \cdot, \cdot \rangle_1$ by:
$$\langle u, v \rangle_1 = \langle Su, Sv \rangle$$
for $u, v \in V$. Then $\langle \cdot, \cdot \rangle_1$ is an inner product on $V$.
Proof
We'll verify each of the properties required of an inner product.
(Positivity): Let $v \in V$. Then:
$$\langle v, v \rangle_1 = \langle Sv, Sv \rangle \geq 0$$
by positivity of the original inner product.
(Definiteness): Let $v \in V$. If $v = 0$ then $Sv = 0$, so $\langle v, v \rangle_1 = \langle Sv, Sv \rangle = 0$. Conversely, if $\langle v, v \rangle_1 = \langle Sv, Sv \rangle = 0$ then by definiteness of the original inner product $Sv = 0$, and since $S$ is injective, $v = 0$. Hence, the new inner product is definite.
(Additivity in the first slot): Let $u, v, w \in V$ be arbitrary. Then:
$$\langle u + w, v \rangle_1 = \langle S(u + w), Sv \rangle = \langle Su + Sw, Sv \rangle = \langle Su, Sv \rangle + \langle Sw, Sv \rangle = \langle u, v \rangle_1 + \langle w, v \rangle_1$$
(Homogeneity in the first slot): Let $u, v \in V$ and $\lambda \in \mathbf{F}$. Then:
$$\langle \lambda u, v \rangle_1 = \langle S(\lambda u), Sv \rangle = \lambda \langle Su, Sv \rangle = \lambda \langle u, v \rangle_1$$
(Conjugate Symmetry): Let $u, v \in V$:
$$\langle u, v \rangle_1 = \langle Su, Sv \rangle = \overline{\langle Sv, Su \rangle} = \overline{\langle v, u \rangle_1}$$
☐
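A numeric sketch with a hypothetical injective $S$ on $\mathbf{R}^2$ (any invertible matrix works), checking additivity and positivity of $\langle u, v \rangle_1 = \langle Su, Sv \rangle$:

```python
# Hypothetical injective S on R^2; <u, v>_1 = <Su, Sv> with the dot product.
S = [[2, 1],
     [0, 3]]  # det = 6 != 0, so S is injective

def apply_S(x):
    return (S[0][0] * x[0] + S[0][1] * x[1], S[1][0] * x[0] + S[1][1] * x[1])

def ip1(u, v):
    Su, Sv = apply_S(u), apply_S(v)
    return Su[0] * Sv[0] + Su[1] * Sv[1]

u, w, v = (1, -2), (3, 5), (-1, 4)
print(ip1((u[0] + w[0], u[1] + w[1]), v))  # additivity: 130
print(ip1(u, v) + ip1(w, v))               # same: 130
print(ip1(u, u))                           # positivity: 36 > 0
```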
25
Theorem
Suppose $S \in \mathcal{L}(V)$ is not injective. Define $\langle \cdot, \cdot \rangle_1$ as in the prior exercise. Then $\langle \cdot, \cdot \rangle_1$ is not an inner product on $V$.
Proof
Since $S$ is not injective, there are two vectors $u \neq w$ such that $Su = Sw$. Let $v = u - w$, so $v \neq 0$ but $Sv = Su - Sw = 0$. Then:
$$\langle v, v \rangle_1 = \langle Sv, Sv \rangle = \langle 0, 0 \rangle = 0$$
while $v \neq 0$, which violates definiteness. As such, $\langle \cdot, \cdot \rangle_1$ isn't an inner product.
☐
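The failure is easy to exhibit numerically with a hypothetical non-injective $S$ (a rank-one matrix), whose null space supplies the offending vector:

```python
# Hypothetical non-injective S on R^2: rank one, null space spanned by (1, -1).
S = [[1, 1],
     [1, 1]]

def apply_S(x):
    return (S[0][0] * x[0] + S[0][1] * x[1], S[1][0] * x[0] + S[1][1] * x[1])

def ip1(u, v):
    Su, Sv = apply_S(u), apply_S(v)
    return Su[0] * Sv[0] + Su[1] * Sv[1]

v = (1, -1)       # nonzero, but Sv = (0, 0)
print(ip1(v, v))  # 0: definiteness fails
```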
6.B: Orthonormal Bases
1
a
Theorem
Suppose $\theta \in \mathbf{R}$. Then $(\cos\theta, \sin\theta), (-\sin\theta, \cos\theta)$ and $(\cos\theta, \sin\theta), (\sin\theta, -\cos\theta)$ are orthonormal bases of $\mathbf{R}^2$.
Proof
We'll show that these pairs are orthonormal by showing each vector has norm 1 and each pair is orthogonal. Since an orthonormal list of length $2 = \dim \mathbf{R}^2$ is automatically a basis, that suffices.
First, for norms:
$$\|(\cos\theta, \sin\theta)\| = \sqrt{\cos^2\theta + \sin^2\theta} = 1$$
and the same computation applies to $(-\sin\theta, \cos\theta)$ and $(\sin\theta, -\cos\theta)$. Thus all vectors in question have norm 1.
For orthogonality, notice that:
$$\langle (\cos\theta, \sin\theta), (-\sin\theta, \cos\theta) \rangle = -\cos\theta\sin\theta + \sin\theta\cos\theta = 0$$
and:
$$\langle (\cos\theta, \sin\theta), (\sin\theta, -\cos\theta) \rangle = \cos\theta\sin\theta - \sin\theta\cos\theta = 0$$
thus each pair of vectors in question is orthogonal.
☐
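Both candidate pairs can be checked numerically at a sample angle:

```python
import math

# Check both candidate pairs are orthonormal at a sample angle.
t = 0.9
pairs = [((math.cos(t), math.sin(t)), (-math.sin(t), math.cos(t))),
         ((math.cos(t), math.sin(t)), (math.sin(t), -math.cos(t)))]

for a, b in pairs:
    print(round(a[0]**2 + a[1]**2, 9),      # ||a||^2 = 1
          round(b[0]**2 + b[1]**2, 9),      # ||b||^2 = 1
          round(a[0]*b[0] + a[1]*b[1], 9))  # <a, b> = 0
```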
b
Theorem
Each orthonormal basis of $\mathbf{R}^2$ is of one of the two forms given in (a).
Proof
Let $v_1, v_2$ be an arbitrary orthonormal basis of $\mathbf{R}^2$. Hence, we need $\|v_1\| = \|v_2\| = 1$ and $\langle v_1, v_2 \rangle = 0$. Since their norms are both 1, the two vectors must lie somewhere on the unit circle.
There's some angle $\theta$ between the $x$-axis and $v_1$, so for simplicity say that $v_1 = (\cos\theta, \sin\theta)$.
This is the first vector of both candidate bases! Notice that $v_2$ would have to follow similar logic, via some other angle $\alpha$: $v_2 = (\cos\alpha, \sin\alpha)$.
Now we'll show $v_2$ has to be one of the other two forms. Notice that:
$$0 = \langle v_1, v_2 \rangle = \cos\theta\cos\alpha + \sin\theta\sin\alpha$$
Using the trigonometric identity:
$$\cos(\theta - \alpha) = \cos\theta\cos\alpha + \sin\theta\sin\alpha = 0$$
so $\theta - \alpha$ is an odd multiple of $\frac{\pi}{2}$, and there are only two really different possibilities: $\alpha = \theta + \frac{\pi}{2}$ and $\alpha = \theta - \frac{\pi}{2}$. If $\alpha = \theta + \frac{\pi}{2}$:
$$v_2 = \left(\cos\left(\theta + \tfrac{\pi}{2}\right), \sin\left(\theta + \tfrac{\pi}{2}\right)\right) = (-\sin\theta, \cos\theta)$$
matching the first candidate. Using $\alpha = \theta - \frac{\pi}{2}$ instead:
$$v_2 = \left(\cos\left(\theta - \tfrac{\pi}{2}\right), \sin\left(\theta - \tfrac{\pi}{2}\right)\right) = (\sin\theta, -\cos\theta)$$
matching the second case.
Notice that $v_1, v_2$, and $\theta$ were arbitrary, so these must be the only two possibilities!
☐
3
Theorem
Suppose $T \in \mathcal{L}(\mathbf{R}^3)$ has an upper-triangular matrix with respect to the basis $(1, 0, 0), (1, 1, 1), (1, 1, 2)$. Then there's an orthonormal basis of $\mathbf{R}^3$ with respect to which the matrix of $T$ is upper triangular.
Proof
We apply Gram-Schmidt to the given basis to produce our orthonormal basis. We are guaranteed that the resulting matrix of $T$ is upper triangular via Chapter 6 - Inner Product Spaces#^b22ab5.
As such, start with $e_1 = (1, 0, 0)$; it already has norm 1, so no normalization is needed.
Now consider $e_2$. Its numerator is:
$$(1, 1, 1) - \langle (1, 1, 1), e_1 \rangle e_1 = (1, 1, 1) - (1, 0, 0) = (0, 1, 1)$$
which we normalize to:
$$e_2 = \left(0, \tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right)$$
Notice that the numerator for $e_3$ is:
$$(1, 1, 2) - \langle (1, 1, 2), e_1 \rangle e_1 - \langle (1, 1, 2), e_2 \rangle e_2 = (1, 1, 2) - (1, 0, 0) - \tfrac{3}{\sqrt{2}}\left(0, \tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right) = \left(0, -\tfrac{1}{2}, \tfrac{1}{2}\right)$$
Then we just normalize our vector:
$$e_3 = \left(0, -\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right)$$
☐
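The computation can be verified by running Gram-Schmidt numerically on the basis $(1,0,0), (1,1,1), (1,1,2)$ (assuming that is the basis in question) with the usual dot product:

```python
import math

# Gram-Schmidt on (1,0,0), (1,1,1), (1,1,2) with the usual dot product.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

basis = [(1, 0, 0), (1, 1, 1), (1, 1, 2)]
ortho = []
for v in basis:
    w = list(v)
    for e in ortho:
        c = dot(v, e)                       # coefficient <v, e>
        w = [wi - c * ei for wi, ei in zip(w, e)]
    n = math.sqrt(dot(w, w))
    ortho.append([wi / n for wi in w])

for e in ortho:
    print([round(x, 6) for x in e])
# [1.0, 0.0, 0.0]
# [0.0, 0.707107, 0.707107]
# [0.0, -0.707107, 0.707107]
```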
4
Theorem
Suppose $n$ is a positive integer. Then:
$$\frac{1}{\sqrt{2\pi}},\ \frac{\cos x}{\sqrt{\pi}},\ \ldots,\ \frac{\cos nx}{\sqrt{\pi}},\ \frac{\sin x}{\sqrt{\pi}},\ \ldots,\ \frac{\sin nx}{\sqrt{\pi}}$$
is an orthonormal list of vectors in $C[-\pi, \pi]$, the space of continuous real-valued functions on $[-\pi, \pi]$ with inner product:
$$\langle f, g \rangle = \int_{-\pi}^{\pi} f(x)\,g(x)\,dx$$
Proof
We check the pairings for arbitrary indices $j, k$. For the cosine part:
$$\left\langle \frac{\cos jx}{\sqrt{\pi}}, \frac{\cos kx}{\sqrt{\pi}} \right\rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} \cos jx \cos kx \, dx = \frac{1}{2\pi}\int_{-\pi}^{\pi} \big(\cos((j - k)x) + \cos((j + k)x)\big)\, dx$$
which is $1$ when $j = k$ and $0$ when $j \neq k$. A similar computation handles the sine part:
$$\left\langle \frac{\sin jx}{\sqrt{\pi}}, \frac{\sin kx}{\sqrt{\pi}} \right\rangle = \frac{1}{2\pi}\int_{-\pi}^{\pi} \big(\cos((j - k)x) - \cos((j + k)x)\big)\, dx$$
And lastly we compare the cosine and sine parts:
$$\left\langle \frac{\cos jx}{\sqrt{\pi}}, \frac{\sin kx}{\sqrt{\pi}} \right\rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} \cos jx \sin kx \, dx = 0$$
since the integrand is an odd function (no matter whether $j = k$ or not). The constant function $\frac{1}{\sqrt{2\pi}}$ has norm 1 and is orthogonal to every $\cos kx$ and $\sin kx$, since each integrates to $0$ over $[-\pi, \pi]$.
☐
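The orthonormality claims can be spot-checked numerically; the sketch below approximates the integrals with a midpoint rule, which is highly accurate for smooth periodic integrands:

```python
import math

# Midpoint-rule approximation of <f, g> = integral of f g over [-pi, pi].
def ip(f, g, n=20000):
    h = 2 * math.pi / n
    return h * sum(f(-math.pi + (i + 0.5) * h) * g(-math.pi + (i + 0.5) * h)
                   for i in range(n))

c = lambda k: (lambda x: math.cos(k * x) / math.sqrt(math.pi))
s = lambda k: (lambda x: math.sin(k * x) / math.sqrt(math.pi))

print(round(ip(c(1), c(1)), 9))  # ~1 (unit norm)
print(round(ip(c(1), c(2)), 9))  # ~0 (orthogonal cosines)
print(round(ip(c(2), s(2)), 9))  # ~0 (cosine vs sine)
```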
5
Theorem
On $\mathcal{P}_2(\mathbf{R})$, consider the inner product given by:
$$\langle p, q \rangle = \int_0^1 p(x)\,q(x)\,dx$$
Apply the Gram-Schmidt Procedure to the basis $1, x, x^2$ to produce an orthonormal basis of $\mathcal{P}_2(\mathbf{R})$.
Proof
Start with $e_1$. Notice that:
$$\|1\|^2 = \int_0^1 1\, dx = 1$$
So $e_1 = 1$. Next:
$$x - \langle x, e_1 \rangle e_1 = x - \tfrac{1}{2}$$
where to normalize we compute:
$$\left\|x - \tfrac{1}{2}\right\|^2 = \int_0^1 \left(x - \tfrac{1}{2}\right)^2 dx = \tfrac{1}{12}$$
So $e_2 = 2\sqrt{3}\left(x - \tfrac{1}{2}\right) = \sqrt{3}\,(2x - 1)$. Now for $e_3$:
$$x^2 - \langle x^2, e_1 \rangle e_1 - \langle x^2, e_2 \rangle e_2 = x^2 - \tfrac{1}{3} - \tfrac{\sqrt{3}}{6}\,\sqrt{3}\,(2x - 1) = x^2 - x + \tfrac{1}{6}$$
where this norm is:
$$\left\|x^2 - x + \tfrac{1}{6}\right\| = \sqrt{\int_0^1 \left(x^2 - x + \tfrac{1}{6}\right)^2 dx} = \frac{1}{6\sqrt{5}}$$
Thus we have:
$$e_3 = \sqrt{5}\,(6x^2 - 6x + 1)$$
☐
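The Gram-Schmidt computation can be reproduced exactly (assuming the basis $1, x, x^2$ and the inner product $\int_0^1 pq$), using rational arithmetic for the monomial integrals:

```python
from fractions import Fraction as F
import math

# Gram-Schmidt on 1, x, x^2 under <p, q> = integral_0^1 p(x) q(x) dx.
# A polynomial is a coefficient list [c0, c1, c2].
def ip(p, q):
    # integral_0^1 x^(i+j) dx = 1/(i+j+1), so everything stays rational
    return sum(a * b * F(1, i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q))

basis = [[F(1), F(0), F(0)], [F(0), F(1), F(0)], [F(0), F(0), F(1)]]
ortho = []  # pairs (unnormalized vector, its squared norm)
for v in basis:
    w = v[:]
    for e, nsq in ortho:
        coef = ip(w, e) / nsq
        w = [wi - coef * ei for wi, ei in zip(w, e)]
    ortho.append((w, ip(w, w)))

for w, nsq in ortho:
    scale = 1 / math.sqrt(nsq)  # normalize for display
    print([round(float(c) * scale, 4) for c in w])
# [1.0, 0.0, 0.0]                i.e. 1
# [-1.7321, 3.4641, 0.0]         i.e. sqrt(3) (2x - 1)
# [2.2361, -13.4164, 13.4164]    i.e. sqrt(5) (6x^2 - 6x + 1)
```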
6
Theorem
Find an orthonormal basis of the previous question's vector space $\mathcal{P}_2(\mathbf{R})$, such that the differentiation operator on $\mathcal{P}_2(\mathbf{R})$ has an upper-triangular matrix with respect to this basis.
Proof
Notice that the differentiation operator $D$ has, with respect to the standard basis $1, x, x^2$ of $\mathcal{P}_2(\mathbf{R})$, the matrix:
$$\mathcal{M}(D) = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 2 \\ 0 & 0 & 0 \end{pmatrix}$$
which is already upper triangular. The standard basis isn't orthonormal, so we apply Gram-Schmidt, which we did in HW 7 - Inner Product Spaces#5, giving us the basis:
$$1, \quad \sqrt{3}\,(2x - 1), \quad \sqrt{5}\,(6x^2 - 6x + 1)$$
Since Gram-Schmidt preserves the spans of the leading sublists, $D$ remains upper triangular with respect to this orthonormal basis.
What happens if the Gram-Schmidt Procedure is applied to a list of vectors that is not linearly independent?
Proof
Suppose we start with vectors $v_1, \ldots, v_m$ where some $v_j$ is a linear combination of $v_1, \ldots, v_{j-1}$. At step $j$ the procedure subtracts from $v_j$ its projections onto $e_1, \ldots, e_{j-1}$, whose span equals $\operatorname{span}(v_1, \ldots, v_{j-1})$. Since $v_j$ lies in that span, the subtraction yields the zero vector, and the procedure breaks down at that step: the zero vector cannot be normalized, so no orthonormal vector $e_j$ is produced.
☐
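A short demonstration of the breakdown, assuming the dependent list $(1,0,0), (0,1,0), (1,1,0)$ (the third vector is the sum of the first two):

```python
import math

# Gram-Schmidt on a dependent list in R^3: the third vector is the sum
# of the first two, so the procedure produces the zero vector there.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

vs = [(1, 0, 0), (0, 1, 0), (1, 1, 0)]
ortho = []
for v in vs:
    w = list(v)
    for e in ortho:
        c = dot(v, e)
        w = [wi - c * ei for wi, ei in zip(w, e)]
    n = math.sqrt(dot(w, w))
    print(w, n)                 # last pass: zero vector, norm 0
    if n > 1e-12:               # cannot normalize the zero vector
        ortho.append([wi / n for wi in w])

print(len(ortho))  # only 2 orthonormal vectors come out
```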
12
Theorem
Suppose $V$ is finite-dimensional and $\langle \cdot, \cdot \rangle_1$, $\langle \cdot, \cdot \rangle_2$ are inner products on $V$ with corresponding norms $\|\cdot\|_1$ and $\|\cdot\|_2$. Then there is a positive number $c$ such that:
$$\|v\|_1 \leq c\,\|v\|_2$$
for all $v \in V$.
Proof
Let $e_1, \ldots, e_n$ be an orthonormal basis of $V$ with respect to $\langle \cdot, \cdot \rangle_2$, which exists because $V$ is finite-dimensional (if $V = \{0\}$, any $c > 0$ works). For $v \in V$, write $v = a_1 e_1 + \cdots + a_n e_n$, so that $\|v\|_2^2 = |a_1|^2 + \cdots + |a_n|^2$.
By the triangle inequality for $\|\cdot\|_1$ and then the Cauchy-Schwarz inequality in $\mathbf{R}^n$:
$$\|v\|_1 \leq \sum_{k=1}^{n} |a_k|\,\|e_k\|_1 \leq \left(\sum_{k=1}^{n} |a_k|^2\right)^{1/2} \left(\sum_{k=1}^{n} \|e_k\|_1^2\right)^{1/2} = c\,\|v\|_2$$
where $c = \left(\sum_{k=1}^{n} \|e_k\|_1^2\right)^{1/2} > 0$. Crucially, $c$ depends only on the two inner products and not on $v$, so the inequality holds for every $v \in V$ with this single constant.
☐
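A numeric illustration with two concrete inner products on $\mathbf{R}^2$: the dot product, and a hypothetical $\langle u, v \rangle_1 = u^{T} A v$ for a symmetric positive-definite $A$. The constant $c = \sqrt{A_{11} + A_{22}}$ comes from expanding over the standard basis, which is orthonormal for the dot product:

```python
import math

# Two inner products on R^2: the dot product, and a hypothetical
# <u, v>_1 = u^T A v for a symmetric positive-definite A.
A = [[2, 1],
     [1, 3]]

def n1sq(v):
    return A[0][0]*v[0]*v[0] + 2*A[0][1]*v[0]*v[1] + A[1][1]*v[1]*v[1]

def n2sq(v):
    return v[0]*v[0] + v[1]*v[1]

# ||e1||_1^2 + ||e2||_1^2 = A[0][0] + A[1][1] for the standard basis,
# giving a valid uniform constant:
c = math.sqrt(A[0][0] + A[1][1])

samples = [(1, 0), (0, 1), (1, 1), (1, -1), (3, -2), (0.3, 0.7)]
ok = all(math.sqrt(n1sq(v)) <= c * math.sqrt(n2sq(v)) + 1e-12 for v in samples)
print(c, ok)  # c = sqrt(5), ok = True
```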