Consider the $\Rightarrow$ direction. Consider some $v \in V$, where we know the list $v$ is LI. Then the only way to write:
$$ a v = 0 $$
is $a = 0$. Notice that, by contradiction, if $v = 0$ then a possible $a \neq 0$ to choose is $a = 1$, since:
$$ 1 \cdot 0 = 0, $$
which contradicts the list being LI. So then $v \neq 0$.
Consider the $\Leftarrow$ direction. Say $v \neq 0$. Assume for contradiction that there's some nonzero $a \in \mathbf{F}$ such that $a v = 0$. Then since $a \neq 0$ we can divide both sides by $a$ to get:
$$ v = 0, $$
which contradicts $v \neq 0$, so then the only choice we have is that $a = 0$. So by definition, since the only choice of $a$ for $a v = 0$ is $a = 0$, the list is LI.
b: A list of two vectors in $V$ is LI iff neither vector is a scalar multiple of the other.
Consider the $\Rightarrow$ direction. Consider $v_1, v_2 \in V$. Say the list $v_1, v_2$ is LI, so then the only $a_1, a_2 \in \mathbf{F}$ where:
$$ a_1 v_1 + a_2 v_2 = 0 $$
is both $a_1 = a_2 = 0$. Assume for contradiction that one vector is a scalar multiple of the other (which one it is doesn't matter here). So then let's say that $v_1 = c v_2$ for some scalar $c$.
First, we should show that both $v_1, v_2 \neq 0$. Notice if one of them was the zero vector, let's say $v_2 = 0$, then the linear combination:
$$ a_1 v_1 + a_2 v_2 = 0 $$
wouldn't require that $a_2 = 0$, as the choice $a_1 = 0$ and $a_2 = 1$ would work. This contradicts the list being LI. As such, then $v_1, v_2 \neq 0$. Thus, since we are considering $v_1 = c v_2$, then $c \neq 0$, as if $c = 0$ then $v_1 = 0$, which is another contradiction. Thus, $c \neq 0$.
But notice that:
$$ 1 \cdot v_1 - c v_2 = 0, $$
but we know from above that $1 \neq 0$ and $c \neq 0$, and so then we've found non-zero scalars that linearly combine to make the zero vector, suggesting the list was LD the whole time. This is a contradiction, so then clearly $v_1 = c v_2$ is false, so neither vector is a scalar multiple of the other.
Now, consider the $\Leftarrow$ direction, so suppose neither vector of $v_1, v_2$ is a scalar multiple of the other, so $v_1 \neq c v_2$ and $v_2 \neq c v_1$ for any $c \in \mathbf{F}$. Again, we can show that neither can be the zero vector. If one of them was, say $v_1 = 0$, then notice that $v_1 = 0 v_2$, which contradicts $v_1$ not being a scalar multiple of $v_2$. WLOG, then $v_2 \neq 0$ as well.
So we know $v_1, v_2 \neq 0$. Notice then that their linear combination:
$$ a_1 v_1 + a_2 v_2 = 0 $$
holds for what values of $a_1, a_2$? Assume for contradiction that one $a_j \neq 0$; WLOG, let's say that $a_1$ is non-zero. Then we can divide by it and see that:
$$ v_1 = -\frac{a_2}{a_1} v_2, $$
but this contradicts $v_1$ not being a scalar multiple of $v_2$, so then clearly $a_1 = 0$. WLOG, then $a_2 = 0$ too. Thus, both $a_1 = a_2 = 0$, showing that the list $v_1, v_2$ is LI.
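As a quick numerical illustration of this criterion (not part of the proof), here is a sketch over $\mathbf{R}$, assuming NumPy is available; linear independence of a list is checked via the rank of the matrix whose columns are the vectors:

```python
import numpy as np

# Two vectors in R^3; neither is a scalar multiple of the other.
v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])

# A list of columns is LI iff the matrix rank equals the number of columns.
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

# Replacing v2 with a scalar multiple of v1 drops the rank, i.e. the list is LD.
assert np.linalg.matrix_rank(np.column_stack([v1, 3 * v1])) == 1
```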
c: $(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0)$ is LI in $\mathbf{F}^4$.

Let the first vector be $v_1$, the second $v_2$, and the third $v_3$. Consider choices of $a_1, a_2, a_3$ to make:
$$ a_1 v_1 + a_2 v_2 + a_3 v_3 = 0. $$
This requires, by expanding, that:
$$ (a_1, a_2, a_3, 0) = (0, 0, 0, 0). $$
Creating 4 equations:
$$ a_1 = 0, \qquad a_2 = 0, \qquad a_3 = 0, \qquad 0 = 0. $$
Thus, all $a_j = 0$, so the list is LI.
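The same computation can be confirmed mechanically; a small sketch assuming NumPy, treating the three vectors as columns of a $4 \times 3$ matrix:

```python
import numpy as np

# The three vectors from part (c), as columns of a 4x3 matrix over R.
A = np.column_stack([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
])

# Full column rank (3) means the only solution of A @ a = 0 is a = 0, i.e. LI.
assert np.linalg.matrix_rank(A) == 3
```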
d: The list $1, z, \dots, z^m$ is LI in $\mathcal{P}(\mathbf{F})$ for each non-negative integer $m$.

Consider scalars $a_0, a_1, \dots, a_m \in \mathbf{F}$. Then consider when:
$$ a_0 \cdot 1 + a_1 z + \cdots + a_m z^m = 0 $$
for every $z$. The right-hand side is the zero polynomial, whose coefficients are all 0, so equating coefficients yields:
$$ a_0 = a_1 = \cdots = a_m = 0. $$
So all $a_j$ are actually 0, so then the list of vectors above is LI.
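For a concrete instance of the coefficient-equating step (here with $m = 2$, using SymPy as an assumed dependency):

```python
from sympy import Poly, solve, symbols

z, a0, a1, a2 = symbols('z a0 a1 a2')

# a0*1 + a1*z + a2*z^2 equals the zero polynomial iff all coefficients vanish.
p = Poly(a0 + a1*z + a2*z**2, z)

# Setting every coefficient to 0 forces a0 = a1 = a2 = 0.
sol = solve(p.all_coeffs(), [a0, a1, a2], dict=True)
assert sol == [{a0: 0, a1: 0, a2: 0}]
```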
Proof
Suppose $v_1, \dots, v_m$ is LI in $V$, and $w \in V$. Suppose $v_1 + w, \dots, v_m + w$ is LD, so there are some $a_1, \dots, a_m \in \mathbf{F}$, not all 0, such that:
$$ a_1 (v_1 + w) + \cdots + a_m (v_m + w) = 0. $$
Rearranging:
$$ a_1 v_1 + \cdots + a_m v_m = -(a_1 + \cdots + a_m) w. $$
If $a_1 + \cdots + a_m = 0$ then that would imply that $a_1 v_1 + \cdots + a_m v_m = 0$ for some non-zero $a_j$, which is a contradiction. Hence, $a_1 + \cdots + a_m \neq 0$. Therefore:
$$ w = -\frac{a_1}{a_1 + \cdots + a_m} v_1 - \cdots - \frac{a_m}{a_1 + \cdots + a_m} v_m. $$
So $w \in \operatorname{span}(v_1, \dots, v_m)$.
☐
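A concrete check of the resulting formula in $\mathbf{R}^3$ (a sketch assuming NumPy; the vectors and coefficients are made up for illustration):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
w = v1 - 2 * v2                    # chosen so that v1 + w, v2 + w is LD

# Witness of dependence: 1*(v1 + w) + (-2)*(v2 + w) = 0.
a1, a2 = 1.0, -2.0
assert np.allclose(a1 * (v1 + w) + a2 * (v2 + w), 0)

# The proof's formula recovers w from the dependence coefficients.
assert np.allclose(w, -(a1 * v1 + a2 * v2) / (a1 + a2))
```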
# 11
Theorem
Suppose $v_1, \dots, v_m$ is LI in $V$, and $w \in V$. Then $v_1, \dots, v_m, w$ is LI iff $w \notin \operatorname{span}(v_1, \dots, v_m)$.
Proof
Suppose $v_1, \dots, v_m$ is LI in $V$, and $w \in V$. So then:
$$ a_1 v_1 + \cdots + a_m v_m = 0 $$
only when all $a_j = 0$. For this proof, we need to prove both directions.

For $\Rightarrow$, suppose $v_1, \dots, v_m, w$ is LI. Now, assume for contradiction that $w \in \operatorname{span}(v_1, \dots, v_m)$, so then there exist constants $c_1, \dots, c_m$ such that:
$$ w = c_1 v_1 + \cdots + c_m v_m. $$
But notice that we can move the $w$ over to create:
$$ c_1 v_1 + \cdots + c_m v_m - w = 0. $$
But look! The coefficient on $w$ is $-1 \neq 0$, so this is a linear combination with a non-zero constant that produces the zero vector, and it's clear that this implies that $v_1, \dots, v_m, w$ is not LI, which contradicts our supposition that it is. Therefore, we must have it that $w \notin \operatorname{span}(v_1, \dots, v_m)$.
Now, consider $\Leftarrow$, so then suppose that $w \notin \operatorname{span}(v_1, \dots, v_m)$. Assume for contradiction that $v_1, \dots, v_m, w$ is instead LD. Then there exist constants $a_1, \dots, a_m, c$, at least one of them non-zero, such that:
$$ a_1 v_1 + \cdots + a_m v_m + c w = 0. $$
But look! Consider the cases where $c$ is or isn't 0. If $c = 0$ then we have:
$$ a_1 v_1 + \cdots + a_m v_m = 0. $$
But since $v_1, \dots, v_m$ is LI, then all $a_j = 0$, which contradicts us having at least one non-zero constant from $v_1, \dots, v_m, w$ being LD. Now instead consider when $c \neq 0$. Then notice that we can solve for $w$:
$$ w = -\frac{a_1}{c} v_1 - \cdots - \frac{a_m}{c} v_m. $$
But this would contradict $w \notin \operatorname{span}(v_1, \dots, v_m)$, as this is a valid linear combination of $v_1, \dots, v_m$ that produces $w$.

Therefore, no matter what, we have a contradiction, so then we must have it that $v_1, \dots, v_m, w$ is instead LI.
☐
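Exercise 11's criterion can also be spot-checked numerically; a sketch assuming NumPy, with a hypothetical helper `is_li`:

```python
import numpy as np

def is_li(*vecs):
    """A list of vectors is LI iff the stacked matrix has rank == list length."""
    return np.linalg.matrix_rank(np.column_stack(vecs)) == len(vecs)

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])

w_outside = np.array([0.0, 0.0, 1.0])  # not in span(v1, v2)
w_inside = 2 * v1 - v2                 # in span(v1, v2)

assert is_li(v1, v2, w_outside)        # appending w outside the span keeps LI
assert not is_li(v1, v2, w_inside)     # appending w inside the span gives LD
```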
We know that there's not a list of 6 polynomials that is linearly independent in $\mathcal{P}_4(\mathbf{F})$, since we can make a spanning list that only uses 5 polynomials; by the lemma that the length of a LI list is at most the length of a spanning list, any LI list of polynomials from this space must have 5 or fewer vectors.
Now for constructing this spanning list. I chose $1, z, z^2, z^3, z^4$. Notice that these are LI, since if we consider:
$$ a_0 \cdot 1 + a_1 z + a_2 z^2 + a_3 z^3 + a_4 z^4 = 0, $$
then (equating coefficients with the zero polynomial) the only solution is all $a_j = 0$, so then the list is LI. Furthermore, it is a valid spanning list, since given any arbitrary $p = b_0 + b_1 z + b_2 z^2 + b_3 z^3 + b_4 z^4 \in \mathcal{P}_4(\mathbf{F})$ we can choose the constants:
$$ a_0 = b_0, \quad a_1 = b_1, \quad a_2 = b_2, \quad a_3 = b_3, \quad a_4 = b_4. $$
Refer to the LI spanning list $1, z, z^2, z^3, z^4$ for $\mathcal{P}_4(\mathbf{F})$. We made a list 5 items long, so again via the same lemma that the length of a LI list is at most the length of a spanning list, any spanning list must be at least as long as our LI list, which was 5 items long. Thus, it is impossible for a 4-item list to exist that spans our space in question.
$V$ is infinite-dimensional (i.e. not finite-dimensional) iff there is an infinite sequence $v_1, v_2, \dots$ of vectors in $V$ such that $v_1, \dots, v_m$ is LI for every positive integer $m$.
Proof
We need to prove both directions.
First the $\Rightarrow$ direction. Suppose that $V$ is infinite-dimensional. Thus, there exists no finite list of vectors from $V$ that spans the space $V$.

We need to show that there is some infinite sequence $v_1, v_2, \dots$ of vectors all in $V$ such that cutting it off at $v_m$ gives a LI list for any positive integer $m$. Consider the base case of just $v_1$: since $V$ is not spanned by any finite list, we can pick some $v_1 \neq 0$, which is clearly LI by part (a). Let our inductive hypothesis suggest that $v_1, \dots, v_m$ is LI. Consider the $m + 1$ case. Since $V$ is infinite-dimensional, then $v_1, \dots, v_m$ cannot span $V$, so there's some new vector $v_{m+1} \notin \operatorname{span}(v_1, \dots, v_m)$; therefore (by exercise 11) adding it in gives us a LI list $v_1, \dots, v_m, v_{m+1}$, completing the inductive step.
For the ($\Leftarrow$) direction, suppose that there is an infinite sequence $v_1, v_2, \dots$ of vectors in $V$ such that $v_1, \dots, v_m$ is LI for every positive integer $m$. Notice that, via contradiction, if we assumed that $V$ had a finite dimension $n$, then some list $w_1, \dots, w_n$ of length $n$ would span $V$. By our supposition, the list $v_1, \dots, v_{n+1}$ is LI, and since a LI list's length is at most a spanning list's length, a LI list of $n + 1$ vectors can't exist in a space spanned by $n$ vectors. This is a contradiction, so no $n$ is the dimension of $V$, so $V$ isn't finite-dimensional.
☐
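For a concrete instance of this characterization, part (d) above already supplies a witness sequence when $V = \mathcal{P}(\mathbf{F})$:

```latex
v_k = z^{k-1} \quad (k = 1, 2, \dots), \qquad
v_1, \dots, v_m \;=\; 1, z, \dots, z^{m-1} \text{ is LI for every positive integer } m,
```

so $\mathcal{P}(\mathbf{F})$ is infinite-dimensional.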
Suppose $p_0, p_1, \dots, p_m$ are polynomials in $\mathcal{P}_m(\mathbf{F})$ such that $p_j(2) = 0$ for each $j$. Then $p_0, p_1, \dots, p_m$ is not LI (so it is LD) in $\mathcal{P}_m(\mathbf{F})$.
Proof
We'll try to prove this via induction over $m$.

As a base case, consider when $m = 0$. Then we have $p_0 \in \mathcal{P}_0(\mathbf{F})$ such that $p_0(2) = 0$. Notice that since we are working in $\mathcal{P}_0(\mathbf{F})$, we are only working with constant functions. As such, since $p_0(2) = 0$ then $p_0 = 0$, so the list is LD.
Now let's do the inductive step. Let $m$ be arbitrary. Our inductive hypothesis is that for polynomials $p_0, \dots, p_m$ where all $p_j(2) = 0$, the list $p_0, \dots, p_m$ is LD. Consider the $m + 1$ case, where we have the list $p_0, \dots, p_m, p_{m+1}$ with the new $p_{m+1}$ also satisfying $p_{m+1}(2) = 0$. Consider trying to combine the polynomials such as to add to the zero polynomial:
$$ c_0 p_0(z) + \cdots + c_m p_m(z) + c_{m+1} p_{m+1}(z) = 0 $$
for all $z$. Now, if $p_{m+1}$ is the zero function then we are done, as we would have the equation:
$$ c_0 p_0(z) + \cdots + c_m p_m(z) + c_{m+1} \cdot 0 = 0, $$
where, since $p_0, \dots, p_m$ is LD via our inductive hypothesis, we can choose our constants such that we have at least one non-zero $c_j$, and for $c_{m+1}$ we can set that to any value, even 0. Thus, our new equation would contain a non-zero linear combination of our vectors giving the zero vector, so then we have shown that $p_0, \dots, p_m, p_{m+1}$ is LD.
Now for the other case, suppose that $p_{m+1}$ isn't the zero function. We still can just choose $c_{m+1} = 0$, so then we'd get:
$$ c_0 p_0(z) + \cdots + c_m p_m(z) = 0, $$
where again via our inductive hypothesis there will be at least one non-zero coefficient among $c_0, \dots, c_m$, showing that our whole combination contains at least one non-zero choice. As such, then the whole list $p_0, \dots, p_m, p_{m+1}$ must be LD.
This completes the proof via mathematical induction.
☐
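A concrete instance of the theorem with $m = 2$ (three polynomials in $\mathcal{P}_2$, each vanishing at 2), checked with SymPy as an assumed dependency; the particular polynomials are made up for illustration:

```python
from sympy import expand, symbols

z = symbols('z')

# Three polynomials in P_2(F), each vanishing at z = 2.
p0 = z - 2
p1 = z**2 - 4
p2 = z**2 - 2*z

assert all(p.subs(z, 2) == 0 for p in (p0, p1, p2))

# A non-trivial linear combination equal to the zero polynomial,
# so the list p0, p1, p2 is LD, as the theorem predicts.
assert expand(-2*p0 + p1 - p2) == 0
```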
Suppose $U$ and $W$ are subspaces of $V$, where $V = U \oplus W$. Suppose that $u_1, \dots, u_m$ is a basis of $U$ and $w_1, \dots, w_n$ is a basis of $W$. Then:
$$ u_1, \dots, u_m, w_1, \dots, w_n $$
is a basis of $V$.
Proof
We need to show that $u_1, \dots, u_m, w_1, \dots, w_n$ is both LI and spans $V$.
First, span. Notice by definition that:
$$ V = U \oplus W = \{ u + w : u \in U,\ w \in W \}, $$
so given any $v \in V$, then there's some representation $v = u + w$, where expanding $u$ and $w$ in their bases gives:
$$ v = a_1 u_1 + \cdots + a_m u_m + b_1 w_1 + \cdots + b_n w_n, $$
so clearly $v$ is in the span of the new list.
Now for LI. Consider when we have:
$$ a_1 u_1 + \cdots + a_m u_m + b_1 w_1 + \cdots + b_n w_n = 0. $$
Notice that we can group elements together:
$$ (a_1 u_1 + \cdots + a_m u_m) + (b_1 w_1 + \cdots + b_n w_n) = 0. $$
Let $u = a_1 u_1 + \cdots + a_m u_m \in U$ and likewise $w = b_1 w_1 + \cdots + b_n w_n \in W$. Then:
$$ u + w = 0. $$
But if $u \neq 0$, that implies that $w = -u$, so then $w$ would be in $U$ as well (it is a subspace after all), and thus there would be two representations of $0$ (namely $u + w$ and $0 + 0$), violating the direct sum rules.

As such, then we must have it that $u = w = 0$, as only zero vectors are allowed to satisfy $u + w = 0$ while still adhering to the direct sum rules. Hence, since $u_1, \dots, u_m$ and $w_1, \dots, w_n$ are bases (and in particular LI), then all $a_j = 0$ and $b_k = 0$ from before, showing LI.
☐
Suppose $V$ is finite-dimensional and $U$ is a subspace of $V$ such that $\dim U = \dim V$. Then $U = V$.
Proof
We know $\dim U = \dim V = n$ for some $n \geq 0$. Then there's some basis $u_1, \dots, u_n$ of $U$ and a basis $v_1, \dots, v_n$ of $V$. But since $U$ is a subspace of $V$, then $U \subseteq V$, so then since each $u_j \in U$, then each $u_j \in V$. As such, $u_1, \dots, u_n$ is a LI list of length $n = \dim V$ in $V$, and a LI list of length $\dim V$ is automatically a basis of $V$, so it's clear that $u_1, \dots, u_n$ is a basis for $V$ as well. As such, then the span of $u_1, \dots, u_n$ equates to the span of $v_1, \dots, v_n$, and since $U = \operatorname{span}(u_1, \dots, u_n)$ and $V = \operatorname{span}(v_1, \dots, v_n)$, then $U = V$.
☐
For $U = \{ p \in \mathcal{P}_4(\mathbf{R}) : \int_{-1}^{1} p = 0 \}$ we have it that $\dim U = 4$, so let's choose a simple basis that satisfies that. Notice that if we plug $p = a_0 + a_1 z + a_2 z^2 + a_3 z^3 + a_4 z^4$ into the condition then we get:
$$ \int_{-1}^{1} p = 2 a_0 + \frac{2}{3} a_2 + \frac{2}{5} a_4 = 0 \quad\Longleftrightarrow\quad a_0 = -\frac{1}{3} a_2 - \frac{1}{5} a_4. $$
We can rewrite our terms into:
$$ p = a_1 z + a_3 z^3 + a_2 \left( z^2 - \frac{1}{3} \right) + a_4 \left( z^4 - \frac{1}{5} \right). $$
Which are our vectors. For not having to deal with fractions, I'll use the following basis. Consider $z, 3z^2 - 1, z^3, 5z^4 - 1$. We'll prove this is a basis for $U$.
For LI, consider when:
$$ c_1 z + c_2 (3z^2 - 1) + c_3 z^3 + c_4 (5z^4 - 1) = 0. $$
Equating coefficients gives that:
$$ -c_2 - c_4 = 0, \qquad c_1 = 0, \qquad 3 c_2 = 0, \qquad c_3 = 0, \qquad 5 c_4 = 0. $$
Clearly these force $c_1 = c_2 = c_3 = c_4 = 0$, so then we have LI.
Now for span. Let $p \in U$ be arbitrary. Then $p = a_0 + a_1 z + a_2 z^2 + a_3 z^3 + a_4 z^4$ and $\int_{-1}^{1} p = 0$. Doing the analysis before, we see that we have $a_0 = -\frac{1}{3} a_2 - \frac{1}{5} a_4$. But notice that:
$$ \frac{a_2}{3} (3z^2 - 1) = a_2 z^2 - \frac{a_2}{3}, \qquad \frac{a_4}{5} (5z^4 - 1) = a_4 z^4 - \frac{a_4}{5}. $$
So then:
$$ p = a_1 z + a_3 z^3 + \frac{a_2}{3} (3z^2 - 1) + \frac{a_4}{5} (5z^4 - 1), $$
showing that $p \in \operatorname{span}$ of our vectors.
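Assuming $U$ is the subspace of $\mathcal{P}_4(\mathbf{R})$ with $\int_{-1}^{1} p = 0$ as above, membership of the four fraction-free basis candidates in $U$ can be verified with SymPy:

```python
from sympy import integrate, symbols

z = symbols('z')

# Fraction-free basis candidates for U = {p in P_4(R) : int_{-1}^{1} p = 0}.
basis = [z, 3*z**2 - 1, z**3, 5*z**4 - 1]

# Each candidate integrates to 0 over [-1, 1], so each lies in U.
for b in basis:
    assert integrate(b, (z, -1, 1)) == 0
```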
b
Extend the vector $1$ to our basis, giving $1, z, 3z^2 - 1, z^3, 5z^4 - 1$.

We know we are looking for a list of 5 vectors, as $\dim \mathcal{P}_4(\mathbf{F}) = 5$, and a LI list of length $\dim \mathcal{P}_4(\mathbf{F})$ is a basis. Hence, we just show that our total list is LI.
Consider:
$$ c_0 \cdot 1 + c_1 z + c_2 (3z^2 - 1) + c_3 z^3 + c_4 (5z^4 - 1) = 0. $$
Equating coefficients gives:
$$ c_0 - c_2 - c_4 = 0, \qquad c_1 = 0, \qquad 3 c_2 = 0, \qquad c_3 = 0, \qquad 5 c_4 = 0, $$
so $c_2 = c_4 = 0$, hence $c_0 = 0$ as well, and all $c_j = 0$. So it's LI, and thus a basis.
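The LI computation for the extended list can be double-checked by ranking its coefficient matrix (a sketch assuming NumPy, with the basis candidates reconstructed above):

```python
import numpy as np

# Rows: coefficients (1, z, z^2, z^3, z^4) of 1, z, 3z^2 - 1, z^3, 5z^4 - 1.
M = np.array([
    [ 1, 0, 0, 0, 0],   # 1
    [ 0, 1, 0, 0, 0],   # z
    [-1, 0, 3, 0, 0],   # 3z^2 - 1
    [ 0, 0, 0, 1, 0],   # z^3
    [-1, 0, 0, 0, 5],   # 5z^4 - 1
])

# Rank 5 = five LI vectors in the 5-dimensional space P_4, hence a basis.
assert np.linalg.matrix_rank(M) == 5
```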
c
Let $W = \operatorname{span}(1)$. We'll show that $\mathcal{P}_4(\mathbf{F}) = U \oplus W$.
First, clearly $U + W = \mathcal{P}_4(\mathbf{F})$, as if we take linear combinations from both then we get any vector from $\mathcal{P}_4(\mathbf{F})$, as we'll show. Let $p \in \mathcal{P}_4(\mathbf{F})$ be arbitrary. Then notice that, writing $p$ in the basis from (b):
$$ p = \underbrace{c_0 \cdot 1}_{\in W} + \underbrace{c_1 z + c_2 (3z^2 - 1) + c_3 z^3 + c_4 (5z^4 - 1)}_{\in U}. $$
Thus $p \in U + W$, so then $U + W = \mathcal{P}_4(\mathbf{F})$.
Now we can show a direct sum via $U \cap W = \{0\}$. Notice that $\dim W = 1$, so let $w \in U \cap W$ be a non-zero vector (for contradiction).

Then
$$ c \cdot 1 = w = b_1 z + b_2 (3z^2 - 1) + b_3 z^3 + b_4 (5z^4 - 1), $$
equating both definitions of $w$ (as an element of $W$ and as an element of $U$). But then that means that $b_1 = b_2 = b_3 = b_4 = 0$ to equate the $z, z^2, z^3, z^4$ terms on both sides. As such, then $c = -b_2 - b_4 = 0$, so then $w = 0$, which is a contradiction. Hence $U \cap W = \{0\}$, so then $\mathcal{P}_4(\mathbf{F}) = U \oplus W$.
Suppose $V_1, \dots, V_m$ are finite-dimensional subspaces of $V$. Then $V_1 + \cdots + V_m$ is finite-dimensional and:
$$ \dim(V_1 + \cdots + V_m) \leq \dim V_1 + \cdots + \dim V_m. $$
Proof
We prove this via induction over $m$.

For the base case, consider when $m = 1$. Then $V_1$ is finite-dimensional by supposition, and clearly $\dim V_1 \leq \dim V_1$, which satisfies the theorem in the $m = 1$ case.
For the inductive hypothesis, suppose the theorem holds for sums of $m$ subspaces. Now consider the $m + 1$ case. Notice that:
$$ V_1 + \cdots + V_{m+1} = (V_1 + \cdots + V_m) + V_{m+1}, $$
and since the left set is finite-dimensional via the inductive hypothesis and the right set is finite-dimensional by supposition, then via the two-subspace case their entire sum is finite-dimensional. Furthermore, we can use the same principle (that $\dim(X + Y) \leq \dim X + \dim Y$ for two subspaces) to say that:
$$ \dim(V_1 + \cdots + V_{m+1}) \leq \dim(V_1 + \cdots + V_m) + \dim V_{m+1} \leq \dim V_1 + \cdots + \dim V_m + \dim V_{m+1}, $$
completing the inductive step.

☐
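The inequality (and that it can be strict) can be illustrated in $\mathbf{R}^3$ with two planes sharing a line; a sketch assuming NumPy, where $\dim(V_1 + V_2)$ is computed as the rank of the concatenated spanning vectors:

```python
import numpy as np

# V1 = span(e1, e2) and V2 = span(e2, e3): two planes in R^3 sharing a line.
V1 = np.column_stack([[1, 0, 0], [0, 1, 0]])
V2 = np.column_stack([[0, 1, 0], [0, 0, 1]])

# dim(V1 + V2) = rank of all spanning vectors side by side.
dim_sum = np.linalg.matrix_rank(np.hstack([V1, V2]))

assert dim_sum == 3          # the sum is all of R^3
assert dim_sum <= 2 + 2      # dim(V1 + V2) <= dim V1 + dim V2, strictly here
```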
Suppose $V_1, \dots, V_m$ are finite-dimensional subspaces of $V$ such that $V_1 + \cdots + V_m$ is a direct sum. Then $V_1 \oplus \cdots \oplus V_m$ is finite-dimensional and:
$$ \dim(V_1 \oplus \cdots \oplus V_m) = \dim V_1 + \cdots + \dim V_m. $$
Proof
Let's use induction similar to (14), to which this problem is very similar; we already have the two-subspace version:
$$ \dim(V_1 + V_2) = \dim V_1 + \dim V_2 - \dim(V_1 \cap V_2). $$
Let $m = 2$. Since we have a direct sum, then $V_1 \cap V_2 = \{0\}$, so then $\dim(V_1 \cap V_2) = 0$, so then:
$$ \dim(V_1 \oplus V_2) = \dim V_1 + \dim V_2, $$
which satisfies the theorem.
Now for our inductive hypothesis. Let $m$ be arbitrary, and say that the theorem holds for such an $m$.

Consider $V_1 \oplus \cdots \oplus V_{m+1}$. Notice that $V_1, \dots, V_m$ is a valid set of subspaces whose sum is a direct sum, and $(V_1 \oplus \cdots \oplus V_m), V_{m+1}$ would also be a valid set, of size 2, whose overall parts are clearly a direct sum. Thus we can use our inductive hypothesis together with the $m = 2$ case:
$$ \dim(V_1 \oplus \cdots \oplus V_{m+1}) = \dim(V_1 \oplus \cdots \oplus V_m) + \dim V_{m+1} = \dim V_1 + \cdots + \dim V_m + \dim V_{m+1}. $$