We'll do some 'baby proofs' of some common Linear Algebra results. But first, some examples of vector spaces:
Example 1
$P(\mathbb{F})$ is the set of all polynomials with coefficients in $\mathbb{F}$. In fact, we claim $P(\mathbb{F})$ is a vector space over $\mathbb{F}$ under the usual addition and scaling of polynomials. For instance:
In $P(\mathbb{R})$, an example is $p(x) = 1 + 2x$. Another polynomial is $q(x) = 3 + x^2$.
We can add $p$ and $q$ to get $p(x) + q(x) = 4 + 2x + x^2$.
We can scale $p$ by $2$ to get: $2p(x) = 2 + 4x$.
The zero polynomial would be $0(x) = 0$.
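The operations above can be sketched in code. This is a minimal illustration (the representation is an assumption, not part of the definitions): a polynomial is stored as a list of coefficients, where index $i$ holds the coefficient of $x^i$.

```python
def poly_add(p, q):
    """Add two polynomials given as coefficient lists, padding the shorter one."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_scale(c, p):
    """Scale every coefficient of the polynomial by the scalar c."""
    return [c * a for a in p]

p = [1, 2]     # p(x) = 1 + 2x
q = [3, 0, 1]  # q(x) = 3 + x^2

print(poly_add(p, q))    # -> [4, 2, 1], i.e. 4 + 2x + x^2
print(poly_scale(2, p))  # -> [2, 4],    i.e. 2 + 4x
```

The zero polynomial in this representation is the all-zeros list, which acts as the additive identity under `poly_add`.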
Example 2
$P_n(\mathbb{F})$ is the set of all polynomials of degree $n$ or less, with coefficients from $\mathbb{F}$. We have the usual polynomial addition and scalar multiplication as defined in the example prior.
Note that for Example 2 above, you need to have the condition of "degree $n$ or less" since, if it was strictly degree $n$ (say $n = 2$ for our case), you could take say $p(x) = x + x^2$ and $q(x) = -x^2$, but $p(x) + q(x) = x$, which has degree $1$, so the set would not be closed under addition.
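We can check this closure failure concretely. A small sketch (the coefficient-list representation is assumed, as before): the sum of two degree-2 polynomials can have a strictly smaller degree.

```python
def degree(coeffs):
    """Degree = index of the last nonzero coefficient (0 for the zero polynomial)."""
    nz = [i for i, a in enumerate(coeffs) if a != 0]
    return nz[-1] if nz else 0

p = [0, 1, 1]   # x + x^2, degree 2
q = [0, 0, -1]  # -x^2,    degree 2
s = [a + b for a, b in zip(p, q)]  # sum is x

assert degree(p) == 2 and degree(q) == 2
assert degree(s) == 1  # degree dropped, so "exactly degree n" is not closed
```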
Also note that we can create an isomorphism from $P_n(\mathbb{F})$ to $\mathbb{F}^{n+1}$ by:
Matching $v_1$ to our constant coefficient $a_0$
Matching $v_2$ to our $x$ coefficient $a_1$
...
So, if we consider $p(x) = 1 + 2x + 3x^2$ and $q(x) = 4 + 5x^2$ in $P_2(\mathbb{R})$ from Example 2 above, we can turn them into vectors in $\mathbb{R}^3$, and then add them together:
$$(1, 2, 3) + (4, 0, 5) = (5, 2, 8) \longleftrightarrow 5 + 2x + 8x^2$$
So $P_n(\mathbb{F})$ behaves like $\mathbb{F}^{n+1}$.
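This correspondence can be sanity-checked numerically. A sketch under the same assumed representation: map $a_0 + a_1 x + a_2 x^2$ to the vector $(a_0, a_1, a_2)$, add the vectors componentwise, and confirm the result is the coefficient vector of the sum polynomial by evaluating at a point.

```python
def to_vec(coeffs, n=2):
    """Map a polynomial in P_n to the tuple (a_0, ..., a_n), padding with zeros."""
    return tuple(coeffs[i] if i < len(coeffs) else 0 for i in range(n + 1))

def eval_poly(coeffs, x):
    """Evaluate a_0 + a_1 x + ... at the point x."""
    return sum(a * x**i for i, a in enumerate(coeffs))

p = [1, 2, 3]  # 1 + 2x + 3x^2
q = [4, 0, 5]  # 4 + 5x^2

vec_sum = tuple(a + b for a, b in zip(to_vec(p), to_vec(q)))
# Adding vectors in R^3 agrees with adding the polynomials themselves:
assert eval_poly(vec_sum, 2) == eval_poly(p, 2) + eval_poly(q, 2)
print(vec_sum)  # -> (5, 2, 8), i.e. 5 + 2x + 8x^2
```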
Proofs in Linear Algebra
Note that $0\vec{v} = \vec{0}$ for any $\vec{v} \in V$. This isn't part of our definition, so we can't say it's a fact yet. But we need to show that this comes as a consequence of our definition. First, a note on the cancellation law, which says that:
Cancellation Law
Suppose $\vec{u}, \vec{v}, \vec{w} \in V$ where $V$ is a vector space. Then:
$$\vec{u} + \vec{w} = \vec{v} + \vec{w} \implies \vec{u} = \vec{v}$$
This also isn't from our definitions, so let's prove it:
Proof
Suppose $\vec{u} + \vec{w} = \vec{v} + \vec{w}$. Then, we know the additive inverse for $\vec{w}$ exists, which we call $-\vec{w}$. Then, notice the following (using associativity of vector addition):
$$(\vec{u} + \vec{w}) + (-\vec{w}) = (\vec{v} + \vec{w}) + (-\vec{w})$$
$$\vec{u} + (\vec{w} + (-\vec{w})) = \vec{v} + (\vec{w} + (-\vec{w}))$$
$$\vec{u} + \vec{0} = \vec{v} + \vec{0}$$
$$\vec{u} = \vec{v}$$
☐
Note that, from now on, we can just say:
$$\vec{v} + (-\vec{w}) = \vec{v} - \vec{w}$$
(i.e. you can use the $-$ sign on its own now. Yay!)
Now we prove $0\vec{v} = \vec{0}$:
Proof
Note that:
$$0\vec{v} = (0 + 0)\vec{v} = 0\vec{v} + 0\vec{v}$$
So then you can replace $0\vec{v}$ on the left with $\vec{0} + 0\vec{v}$ to show that:
$$\vec{0} + 0\vec{v} = 0\vec{v} + 0\vec{v}$$
and by the cancellation law, $\vec{0} = 0\vec{v}$.
☐
Let's do another proof:
Theorem
$a\vec{0} = \vec{0}$ for any scalar $a$.
Proof
Note that:
$$a\vec{0} = a(\vec{0} + \vec{0}) = a\vec{0} + a\vec{0}$$
So since $a\vec{0} + \vec{0} = a\vec{0} + a\vec{0}$, then we can subtract $a\vec{0}$ from both sides and get $\vec{0} = a\vec{0}$.
☐
Theorem
$(-1)\vec{v} = -\vec{v}$. We note that $-\vec{v}$ is the additive inverse of $\vec{v}$.
Proof
$(-1)\vec{v}$ is indeed the additive inverse of $\vec{v}$ if, when we add $\vec{v}$, we get the zero vector. Notice:
$$\vec{v} + (-1)\vec{v} = 1\vec{v} + (-1)\vec{v} = (1 + (-1))\vec{v} = 0\vec{v} = \vec{0}$$
☐
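The three identities we just proved ($0\vec{v} = \vec{0}$, $a\vec{0} = \vec{0}$, and $(-1)\vec{v} = -\vec{v}$) can be sanity-checked numerically in $\mathbb{R}^3$. This is only a spot check on a sample vector, not a proof; the vector and helper names are illustrative.

```python
def scale(c, v):
    """Scalar multiplication in R^3, componentwise."""
    return tuple(c * x for x in v)

def add(u, v):
    """Vector addition in R^3, componentwise."""
    return tuple(a + b for a, b in zip(u, v))

v = (3.0, -1.0, 2.0)
zero = (0.0, 0.0, 0.0)

assert scale(0, v) == zero           # 0v = 0
assert scale(5, zero) == zero        # a0 = 0
assert add(v, scale(-1, v)) == zero  # v + (-1)v = 0, so (-1)v = -v
```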