Studying vectors on their own is boring. Now, we'll look at mappings from vectors to other vectors.
For example, consider $\mathcal{P}(\mathbb{R})$, the space of polynomials with real coefficients. This is a countably infinite-dimensional vector space.
Now let $D$ denote the calculus derivative map. So then $D : \mathcal{P}(\mathbb{R}) \to \mathcal{P}(\mathbb{R})$ such that it's defined by:

$$D(p) = p'$$
and if you remember the power rule, we know how to take the derivative of any polynomial. In fact, $\mathcal{P}(\mathbb{R})$ is closed under differentiation, as we get a polynomial as an output.
$D$ actually has some really nice properties:

1. $D(p + q) = D(p) + D(q)$. This comes from calculus, where $(f + g)' = f' + g'$.
2. $D(cp) = cD(p)$. This comes from calculus, where $(cf)' = cf'$.
Because these 2 properties are satisfied, we say $D$ is a linear map or a linear transformation. Sometimes you'll hear that $D$ is "linear", and that's what this means.
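If you want to sanity-check these two properties numerically, here's a quick Python sketch (encoding a polynomial as a list of coefficients, lowest degree first, is my own choice for this sketch, not anything from the notes):

```python
def deriv(p):
    """Power rule on a coefficient list [a0, a1, a2, ...] meaning a0 + a1*x + a2*x^2 + ..."""
    return [i * c for i, c in enumerate(p)][1:] or [0]

def add(p, q):
    # Pad the shorter list with zeros, then add coefficientwise.
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def scale(c, p):
    return [c * a for a in p]

p = [1, 2, 3]      # 1 + 2x + 3x^2
q = [0, 5, 0, 4]   # 5x + 4x^3

# Property 1: D(p + q) == D(p) + D(q)
assert deriv(add(p, q)) == add(deriv(p), deriv(q))
# Property 2: D(c * p) == c * D(p)
assert deriv(scale(7, p)) == scale(7, deriv(p))
```

Note also that `deriv` always returns a coefficient list, which is the closure-under-differentiation point from above.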
These two properties come up so often that we'll generalize them:
Linear Transformation
Given a vector space $V$ and a vector space $W$, where $V, W$ are over the same field $\mathbb{F}$, then the function $T : V \to W$ is a linear transformation iff:

1. $T(u + v) = T(u) + T(v)$ for all $u, v \in V$
2. $T(cv) = cT(v)$ for all $c \in \mathbb{F}$ and all $v \in V$
Note that there are two different $+$ signs in (1). The $+$ on the left adds vectors in $V$, while the $+$ on the right adds vectors in $W$.
Some notation: sometimes you want to have the collection of linear maps from $V$ to $W$. We write $\mathcal{L}(V, W)$ to denote this collection of linear maps from $V$ to $W$. Furthermore, if $W$ is $V$, then instead of saying $\mathcal{L}(V, V)$ we can just say $\mathcal{L}(V)$.
So then, as an example, $D \in \mathcal{L}(\mathcal{P}(\mathbb{R}))$.
But, okay, this is mind-blowing: $\mathcal{L}(V, W)$ itself is a vector space. That's because:
The zero function is in our space.
Adding two linear maps gives a function that's also a linear map.
There's an additive inverse for every $T \in \mathcal{L}(V, W)$.
...
Okay, let's see this in more detail. Given a function $T \in \mathcal{L}(V, W)$, define $-T$ by $(-T)(v) = -T(v)$. Since $(T + (-T))(v) = T(v) - T(v) = 0$, $-T$ is the additive inverse of $T$ and vice versa.
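Here's a tiny Python sketch of that pointwise definition of $-T$ (the particular map on $\mathbb{R}^2$ below is just a hypothetical example of mine, with vectors encoded as tuples):

```python
def T(v):
    # A hypothetical linear map R^2 -> R^2.
    x, y = v
    return (2 * x + y, 3 * y)

def neg(f):
    """Given f, return -f, defined pointwise by (-f)(v) = -(f(v))."""
    return lambda v: tuple(-c for c in f(v))

negT = neg(T)
v = (1.0, 4.0)
# (T + (-T))(v) should be the zero vector of W = R^2.
total = tuple(a + b for a, b in zip(T(v), negT(v)))
assert total == (0.0, 0.0)
```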
Properties of Linear Maps
Here are some properties of linear maps:
Properties of Linear Maps
For the following, we say $T \in \mathcal{L}(V, W)$, or $T : V \to W$ is linear.
$T(0) = 0$. This is because $T(0) = T(0 + 0) = T(0) + T(0)$, or $T(0) = T(0 \cdot 0) = 0 \cdot T(0) = 0$. Notice that the zero vectors may be different, as they may come from different vector spaces.
$T(-v) = -T(v)$.
$T(u - v) = T(u) - T(v)$.
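A quick numeric check of these properties on a made-up linear map $T : \mathbb{R}^2 \to \mathbb{R}^2$ (my own example, not from the notes):

```python
def T(v):
    # A hypothetical linear map R^2 -> R^2.
    x, y = v
    return (x - y, 2 * x)

zero = (0, 0)
u, v = (3, 1), (5, -2)
sub = lambda a, b: tuple(p - q for p, q in zip(a, b))
neg = lambda a: tuple(-p for p in a)

assert T(zero) == (0, 0)                # T(0) = 0
assert T(neg(v)) == neg(T(v))           # T(-v) = -T(v)
assert T(sub(u, v)) == sub(T(u), T(v))  # T(u - v) = T(u) - T(v)
```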
Let's look at an example. Consider $T : \mathbb{R}^2 \to \mathbb{R}^2$, the transformation that rotates an input vector CCW by $\theta$ radians. We can show that this is a linear transformation, via a picture:
You could check property 2 (scalar multiplication) too and see that it holds as well.
Let's look at another example. Let's work in $C(\mathbb{R})$, the space of continuous functions with domain $\mathbb{R}$. We'll define a function $T : C(\mathbb{R}) \to \mathbb{R}$, defined by:

$$T(f) = \int_a^b f(x)\,dx$$

where $a, b$ are fixed, real numbers. We claim that $T$ is linear:
$T(f + g) = \int_a^b (f(x) + g(x))\,dx = \int_a^b f(x)\,dx + \int_a^b g(x)\,dx = T(f) + T(g)$.
$T(cf) = \int_a^b cf(x)\,dx = c\int_a^b f(x)\,dx = cT(f)$.
So if we look at $T$, then $T \in \mathcal{L}(C(\mathbb{R}), \mathbb{R})$. But notice that $a, b$ here were arbitrary. If we have another one, say $S \in \mathcal{L}(C(\mathbb{R}), \mathbb{R})$ defined with different endpoints, then it's still a linear transformation. But if we consider $T + S$, then that itself is still a linear transformation! You can see how meta this can get...
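We can sanity-check the linearity of $T$ numerically with a simple midpoint-rule approximation of the integral (the functions $f$, $g$, the scalar, and the endpoints here are all arbitrary choices of mine):

```python
def integral(func, a, b, n=10_000):
    # Midpoint-rule approximation of the integral of func on [a, b].
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 0.0, 2.0
f = lambda x: x ** 2
g = lambda x: 3 * x + 1
c = 5.0

T = lambda func: integral(func, a, b)

# T(f + g) == T(f) + T(g), up to floating-point error.
assert abs(T(lambda x: f(x) + g(x)) - (T(f) + T(g))) < 1e-9
# T(c * f) == c * T(f), up to floating-point error.
assert abs(T(lambda x: c * f(x)) - c * T(f)) < 1e-9
```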
LT's Defined by Their Bases
Linear Transformations Defined on the Actions of their Basis
Given a vector space $V$ with basis $v_1, \ldots, v_n$, then given any vectors $w_1, \ldots, w_n$ in a vector space $W$ (all over the same field $\mathbb{F}$), there is a unique linear transformation $T : V \to W$ satisfying:

$$T(v_i) = w_i$$

for all $i \in \{1, \ldots, n\}$.
We'll do part of the proof today:
Proof
This is an existence/uniqueness proof. We'll have to define a linear transformation. Suppose $T$ were a LT with the property that $T(v_i) = w_i$ for all $i$. Let $v \in V$ be an arbitrary vector. Then since $v_1, \ldots, v_n$ is a basis, there's a linear combination of the basis elements that constructs $v$:
$$v = a_1 v_1 + \cdots + a_n v_n$$

where each $a_i$ exists and is unique. Then:

$$T(v) = T(a_1 v_1 + \cdots + a_n v_n) = a_1 T(v_1) + \cdots + a_n T(v_n)$$

but since $T(v_i) = w_i$:

$$T(v) = a_1 w_1 + \cdots + a_n w_n$$
so if there were two such $T$'s, doing this process yields that they are the same.
But is there one? We can just define $T$ as:

$$T(v) = a_1 w_1 + \cdots + a_n w_n$$
and all we would have to show is that $T$ is linear and that $T(v_i) = w_i$ for all $i$. The second part just comes from the definition, but the linear part requires a bit more work.
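To make the construction concrete, here's a Python sketch that builds $T$ from the images $w_i$ of the standard basis of $\mathbb{R}^2$ (so the coordinates $a_i$ of $v$ are just its entries; the specific numbers are hypothetical examples of mine):

```python
def make_T(basis_images):
    """Given w_i = T(e_i) for the standard basis e_1, ..., e_n of R^n,
    build T via T(v) = a_1*w_1 + ... + a_n*w_n, where the a_i are the
    coordinates (entries) of v."""
    def T(v):
        m = len(basis_images[0])
        out = [0.0] * m
        for a_i, w_i in zip(v, basis_images):
            for j in range(m):
                out[j] += a_i * w_i[j]
        return tuple(out)
    return T

# Hypothetical images of e1, e2 in R^2, landing in W = R^3.
w1, w2 = (1.0, 0.0, 2.0), (0.0, 3.0, -1.0)
T = make_T([w1, w2])

assert T((1.0, 0.0)) == w1                  # T(e1) = w1
assert T((0.0, 1.0)) == w2                  # T(e2) = w2
assert T((2.0, 5.0)) == (2.0, 15.0, -1.0)   # 2*w1 + 5*w2
```

The last assertion is exactly the formula above in action: the coordinates of $(2, 5)$ in the standard basis are $a_1 = 2$, $a_2 = 5$, so $T(v) = 2w_1 + 5w_2$.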