Lecture 9 - Linear Independence and Span
We'll cover linear independence and span today.
Span
Define $\operatorname{span}(v_1, \dots, v_m)$, where the argument is a list of vectors, as the set of all linear combinations of those vectors, ie:
$$\operatorname{span}(v_1, \dots, v_m) = \{ a_1 v_1 + \cdots + a_m v_m : a_1, \dots, a_m \in \mathbb{F} \}$$
Note that:
- The span of a list of vectors is a subspace (it contains $0$ and is closed under addition and scalar multiplication; see Chapter 2 - Finite-Dimensional Vector Spaces#2.A Span and Linear Independence).
- It is also the smallest subspace containing those vectors, per Chapter 2 - Finite-Dimensional Vector Spaces#Linear Combinations and Span.
- If we were to define the subspaces $V_i = \{ a v_i : a \in \mathbb{F} \}$ (ie: the scalar multiples of $v_i$), then we get that $\operatorname{span}(v_1, \dots, v_m) = V_1 + \cdots + V_m$.
- There is a unique way to write any vector in $V_1 + \cdots + V_m$ precisely when the sum $V_1 + \cdots + V_m$ is direct!
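As a quick numerical illustration of the definition (a sketch of my own, not part of the lecture: the vectors and the `in_span` helper are made up), a vector lies in a span exactly when some linear combination of the list reproduces it, which we can test with least squares:

```python
import numpy as np

def in_span(vectors, w, tol=1e-10):
    """Return True if w is (numerically) a linear combination of `vectors`."""
    A = np.column_stack(vectors)              # columns are the spanning vectors
    coeffs, *_ = np.linalg.lstsq(A, w, rcond=None)
    return np.linalg.norm(A @ coeffs - w) < tol

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])

print(in_span([v1, v2], np.array([2.0, 3.0, 5.0])))  # 2*v1 + 3*v2, so True
print(in_span([v1, v2], np.array([0.0, 0.0, 1.0])))  # no combination works, so False
```

The least-squares solve finds the best coefficients; a (near-)zero residual means the vector really is a combination of the list.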
There's some language here to deal with span:
If $\operatorname{span}(v_1, \dots, v_m) = V$, we say that $(v_1, \dots, v_m)$ spans $V$. Another way to say this is that $(v_1, \dots, v_m)$ is a spanning list for $V$.
Let's look at some examples:
Let $V$ be the vector space of $2 \times 2$ matrices with real entries, so $V = \mathbb{R}^{2 \times 2}$. Let $U \subseteq V$ be the subset of symmetric matrices with real entries. So $A \in U$ if it's of the form:
$$A = \begin{pmatrix} a & b \\ b & c \end{pmatrix}, \qquad a, b, c \in \mathbb{R}$$
Is $U$ a subspace? Yes: all the required properties hold ($0 \in U$, and sums and scalar multiples of symmetric matrices are symmetric). But can we find a spanning list for $U$?
Proof
Yes you can, using:
$$E_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \qquad E_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad E_3 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$$
We can prove that $\operatorname{span}(E_1, E_2, E_3) = U$. We show both $\subseteq$ and $\supseteq$. The $\subseteq$ direction is free: $E_1, E_2, E_3 \in U$ and $U$ itself is a subspace, so it must contain their span.
For the latter, we need to show that any vector $A \in U$ satisfies $A \in \operatorname{span}(E_1, E_2, E_3)$. Such an $A$ must be of the form:
$$A = \begin{pmatrix} a & b \\ b & c \end{pmatrix}$$
for some $a, b, c \in \mathbb{R}$, so then $A = a E_1 + b E_2 + c E_3 \in \operatorname{span}(E_1, E_2, E_3)$.
☐
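The decomposition in the proof can be sanity-checked numerically. This is a sketch under my assumptions (the $2 \times 2$ symmetric case, with `E1`, `E2`, `E3` as the three matrices carrying the $a$, $b$, $c$ slots):

```python
import numpy as np

# Assumed spanning list for the 2x2 real symmetric matrices.
E1 = np.array([[1.0, 0.0], [0.0, 0.0]])
E2 = np.array([[0.0, 1.0], [1.0, 0.0]])
E3 = np.array([[0.0, 0.0], [0.0, 1.0]])

a, b, c = 2.0, -1.0, 3.0
A = np.array([[a, b], [b, c]])            # an arbitrary symmetric matrix

# The entries themselves are the coefficients: A = a*E1 + b*E2 + c*E3.
assert np.allclose(A, a * E1 + b * E2 + c * E3)
print("A decomposes as a*E1 + b*E2 + c*E3")
```

The point of the $\supseteq$ direction is exactly this: the coefficients of the combination can be read off the entries of the symmetric matrix.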
Note that $(E_1, E_2, E_3)$ isn't the only spanning list. We could add, say, another symmetric matrix:
$$E_4 = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$$
and $(E_1, E_2, E_3, E_4)$ is still a spanning list.
But here the uniqueness of the combinations of $(E_1, E_2, E_3)$ is interesting to us. But what if we had:
$$(E_1, E_2, E_3, 2E_1)$$
This is still a spanning list. In general, if you know that $\operatorname{span}(v_1, \dots, v_m) = V$, then $\operatorname{span}(v_1, \dots, v_m, w)$ is the same (provided that $w \in V$).
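One way to see this redundancy numerically (a sketch with made-up vectors, not from the lecture): appending a vector that is already in the span doesn't enlarge it, which shows up as an unchanged matrix rank:

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
w = 2.0 * v1 + 3.0 * v2                   # w is already in span(v1, v2)

rank_before = np.linalg.matrix_rank(np.column_stack([v1, v2]))
rank_after = np.linalg.matrix_rank(np.column_stack([v1, v2, w]))
print(rank_before, rank_after)            # same rank: the span didn't grow
```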
So ideally:
- We choose spanning lists that have the fewest number of vectors
- We choose vectors that are not scalar multiples of one another (although in the grand scheme of things this is arbitrary).
This brings us to...
Linear Independence
$(v_1, \dots, v_m)$ are linearly independent if for all $a_1, \dots, a_m \in \mathbb{F}$ we have:
$$a_1 v_1 + \cdots + a_m v_m = 0 \implies a_1 = \cdots = a_m = 0$$
We already know $0 v_1 + \cdots + 0 v_m = 0$. But this definition implies that this is the only way to write $0$ as a combination of the $v_i$. Why is this the case? See Chapter 2 - Finite-Dimensional Vector Spaces#Linear Independence. If you wish to prove that $v_1, \dots, v_m$ are linearly independent:
- Suppose that $a_1 v_1 + \cdots + a_m v_m = 0$.
- From that, determine that all $a_i = 0$.
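The two steps above are a proof technique; as a purely numerical (illustrative, not rigorous) stand-in, the combination $a_1 v_1 + \cdots + a_m v_m = 0$ has only the trivial solution exactly when the matrix with the $v_i$ as columns has full column rank. A sketch with a hypothetical helper:

```python
import numpy as np

def is_linearly_independent(vectors):
    """True iff the only combination summing to 0 is the trivial one,
    i.e. the matrix with these columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_linearly_independent([np.array([1.0, 2.0]), np.array([0.0, 1.0])]))  # True
print(is_linearly_independent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # False: (2,4) = 2*(1,2)
```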
When a definition is given in math, "if" is equivalent to "iff"!
$(v_1, \dots, v_m)$ is linearly dependent (LD) if it is not linearly independent (LI).
The negation is that there exist some $a_1, \dots, a_m \in \mathbb{F}$, not all $0$, such that:
$$a_1 v_1 + \cdots + a_m v_m = 0$$
Consider, say, $v_1 = (1, 2)$ and $v_2 = (2, 4)$ in $\mathbb{R}^2$. Are these LI or LD? They are dependent since:
$$v_2 = 2 v_1$$
But use the definition:
$$2 v_1 + (-1) v_2 = 0$$
Notice that this is using the definition: the coefficients $2$ and $-1$ are not all $0$. We could still write:
$$0 v_1 + 0 v_2 = 0$$
Which isn't interesting and doesn't use the definition.
Note that $()$ (the empty list) is LI, as the definition is vacuously satisfied. Also, it is the list that is LI or LD, not the vectors themselves.
Suppose I have a collection of vectors $(v_1, \dots, v_m, 0)$. This list is LD since we can put any nonzero scalar $a$ in front of $0$:
$$0 v_1 + \cdots + 0 v_m + a \cdot 0 = 0$$
So any list containing the zero vector is LD.
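Consistent with this, in the rank picture (again a small sketch of my own) a zero vector contributes a zero column, so the list can never have full column rank and is therefore LD:

```python
import numpy as np

vectors = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 0.0]),
           np.array([0.0, 0.0, 0.0])]     # the zero vector is in the list

A = np.column_stack(vectors)
# Full column rank would be 3; the zero column caps the rank at 2, so LD.
print(np.linalg.matrix_rank(A) < len(vectors))  # True
```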