Before our definition, consider an infinite sum $\sum_{n=1}^{\infty} a_n$. Then the sequence of partial sums is just the first $m$ terms summed up:

$$s_m = a_1 + a_2 + \cdots + a_m$$
So given the definition of a convergent sequence, we'll have a definition for a convergent series as well.
Series
An infinite series is a formal expression of the form:

$$\sum_{n=1}^{\infty} a_n = a_1 + a_2 + a_3 + \cdots$$

To this we associate the sequence of partial sums $(s_m)$, the sequence defined as:

$$s_m = a_1 + a_2 + \cdots + a_m$$

We say that $\sum_{n=1}^{\infty} a_n$ converges to $A$ and write:

$$\sum_{n=1}^{\infty} a_n = A$$

to mean that $\lim_{m \to \infty} s_m = A$.
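To see the definition in action numerically, here is a minimal sketch (the helper name `partial_sums` is ours):

```python
def partial_sums(a, m):
    """Return the partial sums s_1, s_2, ..., s_m of the series sum_{n>=1} a(n)."""
    sums, s = [], 0.0
    for n in range(1, m + 1):
        s += a(n)          # s_n = s_{n-1} + a_n
        sums.append(s)
    return sums

# The series sum 1/2^n has partial sums s_m = 1 - 1/2^m, approaching 1:
print(partial_sums(lambda n: 1 / 2**n, 5))  # [0.5, 0.75, 0.875, 0.9375, 0.96875]
```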
The difficulty here is that $s_m$ is usually very hard to compute. But let's look at a simple one.
Example: Geometric Series
Let $r \in \mathbb{R}$. Consider:

$$\sum_{n=0}^{\infty} r^n = 1 + r + r^2 + r^3 + \cdots$$

Recall that in general:

$$1 - r^m = (1 - r)(1 + r + r^2 + \cdots + r^{m-1})$$

(multiply out the right side to verify). With that then:

$$s_m = 1 + r + r^2 + \cdots + r^{m-1} = \frac{1 - r^m}{1 - r}$$

so long as $r \neq 1$ (which diverges anyways, as we'll see below).
Then from our talk on Monotonic Sequences, when $|r| < 1$ we have $\lim_{m \to \infty} r^m = 0$. So if $|r| < 1$ then $r^m \to 0$, so then:

$$\lim_{m \to \infty} s_m = \lim_{m \to \infty} \frac{1 - r^m}{1 - r} = \frac{1}{1 - r}$$

allowing us to make this deduction.
If instead $|r| > 1$ then $(r^m)$ is unbounded, so then the sequence of partial sums has to diverge (since all convergent sequences are bounded, so unbounded must imply divergent).
If $r = 1$ then it diverges since $s_m = m$ is also unbounded as $m \to \infty$. If $r = -1$ then:

$$s_m = \begin{cases} 1 & \text{if } m \text{ is odd} \\ 0 & \text{if } m \text{ is even} \end{cases}$$

which is divergent since we can use the definition of convergence and show that for a small enough $\epsilon$, any proposed limit has an $\epsilon$-neighborhood missing $0$ or $1$, showing divergence.
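A quick numerical illustration of these cases by direct summation (names ours):

```python
def geometric_partial_sum(r, m):
    """s_m = 1 + r + r^2 + ... + r^(m-1), by direct summation."""
    return sum(r**n for n in range(m))

# |r| < 1: partial sums approach 1/(1 - r); here 1/(1 - 0.5) = 2
print(geometric_partial_sum(0.5, 50))
# r = -1: partial sums alternate 1, 0, 1, 0, ... and never settle
print([geometric_partial_sum(-1, m) for m in range(1, 6)])  # [1, 0, 1, 0, 1]
```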
Consider for a moment the series $\sum_{n=1}^{\infty} a_n$ where $a_n \geq 0$ for all $n$. As a result, we have to have $(s_m)$ be increasing (since there are no negative terms to decrease the partial sums), and thus monotonic. Using our work on Monotonic Sequences, then, $(s_m)$ converges iff it is bounded above (obviously it is bounded below already, by $0$).
So the name of the game to show the series converges is to show an upper bound for the partial sums. Similarly, to show it diverges, show that no such upper bound can exist.
Series Converges iff Bounded Partial Sums

A series $\sum_{n=1}^{\infty} a_n$ where $a_n \geq 0$ for all $n$ converges iff $(s_m)$, the sequence of partial sums, is bounded above.
Proof
See our remark above.
☐
Example 2
Consider the series:

$$\sum_{n=1}^{\infty} \frac{1}{n^2}$$

It's a $p$-series with $p = 2 > 1$, so we expect it to converge. But let's show it converges. We'll show that there's a bound for $(s_m)$. Notice that:

$$s_m = 1 + \frac{1}{2^2} + \frac{1}{3^2} + \cdots + \frac{1}{m^2} \leq 1 + \frac{1}{2 \cdot 1} + \frac{1}{3 \cdot 2} + \cdots + \frac{1}{m(m-1)}$$

Notice that:

$$1 + \sum_{n=2}^{m} \frac{1}{n(n-1)} = 1 + \sum_{n=2}^{m} \left( \frac{1}{n-1} - \frac{1}{n} \right) = 1 + 1 - \frac{1}{m} < 2$$

Thus then $(s_m)$ is bounded above by $2$, so then the series converges. Not only that, but by Limits and Order (the Order Limit Theorem), it converges to some value no greater than $2$.
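Numerically, the partial sums do stay below $2$; a quick check (the limit happens to be $\pi^2/6 \approx 1.645$, a fact we won't need here):

```python
import math

s = 0.0
for n in range(1, 100001):
    s += 1 / n**2          # partial sum s_m of sum 1/n^2

print(s < 2)               # True: bounded above by 2
print(s)                   # about 1.64492, near pi^2/6
```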
Example 3: Harmonic Series
Consider:

$$\sum_{n=1}^{\infty} \frac{1}{n}$$

We know that this diverges, but let's prove it. We have to show that $(s_m)$ is unbounded. Group the terms into blocks of lengths $1, 1, 2, 4, 8, \ldots$:

$$s_{2^k} = 1 + \frac{1}{2} + \left( \frac{1}{3} + \frac{1}{4} \right) + \left( \frac{1}{5} + \cdots + \frac{1}{8} \right) + \cdots + \left( \frac{1}{2^{k-1}+1} + \cdots + \frac{1}{2^k} \right) \geq 1 + \frac{1}{2} + \frac{2}{4} + \frac{4}{8} + \cdots + \frac{2^{k-1}}{2^k} = 1 + \frac{k}{2}$$

Clearly as $k \to \infty$ then $1 + \frac{k}{2} \to \infty$ (diverges), so then $(s_m)$ must also diverge since it's unbounded.
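The bound $s_{2^k} \geq 1 + \frac{k}{2}$ can be checked directly for small $k$ (a sketch, names ours):

```python
def harmonic_partial_sum(m):
    """s_m = 1 + 1/2 + ... + 1/m."""
    return sum(1 / n for n in range(1, m + 1))

# s_{2^k} >= 1 + k/2, so the partial sums grow without bound:
for k in range(1, 15):
    assert harmonic_partial_sum(2**k) >= 1 + k / 2
print("s_{2^k} >= 1 + k/2 holds for k = 1..14")
```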
Example 4: The General Case
Now consider the abstract, general case of $\sum_{n=1}^{\infty} a_n$ where $a_n \geq 0$ and $a_{n+1} \leq a_n$ for all $n$ (decreasing terms). Let $t_k = a_1 + 2a_2 + 4a_4 + \cdots + 2^k a_{2^k}$. Then, grouping into blocks of powers of two similar to our previous example:

$$s_{2^k} = a_1 + a_2 + (a_3 + a_4) + \cdots + (a_{2^{k-1}+1} + \cdots + a_{2^k}) \geq a_1 + a_2 + 2a_4 + \cdots + 2^{k-1} a_{2^k} \geq \frac{1}{2} t_k$$

So if $s_{2^k} \geq \frac{1}{2} t_k$, then showing that $(t_k)$ is unbounded would imply that $(s_m)$ is unbounded, showing a divergent series.

We could do a bound in the other direction too:

$$s_{2^{k+1} - 1} = a_1 + (a_2 + a_3) + (a_4 + \cdots + a_7) + \cdots + (a_{2^k} + \cdots + a_{2^{k+1} - 1}) \leq a_1 + 2a_2 + 4a_4 + \cdots + 2^k a_{2^k} = t_k$$

So if $(t_k)$ is bounded above then so is $(s_m)$, and the series converges.
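This computation is essentially the Cauchy Condensation Test: for decreasing nonnegative terms, $\sum a_n$ converges iff the condensed series $\sum 2^k a_{2^k}$ does. A numerical sketch comparing our two examples (function name ours):

```python
def condensed_sum(a, k):
    """t_k = a(1) + 2*a(2) + 4*a(4) + ... + 2^k * a(2^k)."""
    return sum(2**j * a(2**j) for j in range(k + 1))

# a_n = 1/n: every term is 2^j * (1/2^j) = 1, so t_k = k + 1 is unbounded
# -> the harmonic series diverges.
print([condensed_sum(lambda n: 1 / n, k) for k in range(5)])      # [1.0, 2.0, 3.0, 4.0, 5.0]

# a_n = 1/n^2: the terms are (1/2)^j, so t_k = 2 - (1/2)^k is bounded by 2
# -> sum 1/n^2 converges.
print([condensed_sum(lambda n: 1 / n**2, k) for k in range(5)])   # [1.0, 1.5, 1.75, 1.875, 1.9375]
```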
There are some cautionary tales that will help develop more rigorous definitions here.
Cautionary Tales
The First Tale
Consider the alternating harmonic series (AHS):

$$\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \cdots$$

From calculus we can use the AST (Alternating Series Test) to show that this converges. Specifically it is given that it converges to $\ln 2$, but this isn't important, so say it converges to $S$:

$$1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \frac{1}{6} + \cdots = S$$

Now multiply the series by $\frac{1}{2}$:

$$\frac{1}{2} - \frac{1}{4} + \frac{1}{6} - \frac{1}{8} + \cdots = \frac{S}{2}$$

Now add the two equations term by term, lining up each term of the $\frac{S}{2}$ equation with the matching even-denominator term in the $S$ equation (inserting zeros elsewhere):

$$1 + \frac{1}{3} - \frac{1}{2} + \frac{1}{5} + \frac{1}{7} - \frac{1}{4} + \frac{1}{9} + \frac{1}{11} - \frac{1}{6} + \cdots = \frac{3S}{2}$$

The left-hand side is a rearrangement of the terms of the AHS. It seems that rearranging the series gives a different number, since $\frac{3S}{2} \neq S$.
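We can watch this happen numerically: summing the rearranged pattern (two positive odd-denominator terms, then one negative even-denominator term) approaches $\frac{3}{2} \ln 2 \approx 1.04$ rather than $\ln 2 \approx 0.69$. A sketch:

```python
import math

# Rearranged AHS: two positive odd-denominator terms, then one negative even-denominator term
total = 0.0
odd, even = 1, 2
for _ in range(100000):
    total += 1 / odd + 1 / (odd + 2) - 1 / even
    odd += 4
    even += 2

print(total)               # close to (3/2) ln 2
print(1.5 * math.log(2))   # about 1.0397
```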
Summary
The order of added terms dictates what number the series converges to.
Because the set of rearrangements (bijections $\mathbb{N} \to \mathbb{N}$) is uncountable, there are uncountably many orders in which to add the terms. This suggests that we could order the terms in just the right way to get any real number out. In fact, we claim that:
Claim
Given any $c \in \mathbb{R}$, there is some rearrangement of the AHS such that it adds up to $c$.
This is crazy! This seems to break commutativity that we know in the finite sense.
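A sketch of why the claim is plausible: greedily take positive terms until the partial sum exceeds the target, then negative terms until it drops below, and repeat. (This is the standard idea behind the Riemann rearrangement theorem; the code is our illustration, not a proof.)

```python
def rearranged_partial_sum(target, num_terms):
    """Greedily rearrange the AHS terms (+1/1, +1/3, ... and -1/2, -1/4, ...)
    so that the partial sums chase `target`."""
    pos, neg = 1, 2   # next odd (positive) and even (negative) denominators
    s = 0.0
    for _ in range(num_terms):
        if s <= target:
            s += 1 / pos
            pos += 2
        else:
            s -= 1 / neg
            neg += 2
    return s

print(rearranged_partial_sum(0.3, 100000))   # close to 0.3
```

Since both the positive and the negative terms alone form divergent series, the greedy process never gets stuck, and the overshoot shrinks to zero because the terms do.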
The Second Tale
What if we do double summations (similar to that of doing double integrals)? Consider the series:

$$\frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots = 1$$

Let's make it add to zero by putting a $-1$ out front:

$$-1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 0$$

Remark that putting a $0$ out front doesn't change the sum:

$$0 - 1 + \frac{1}{2} + \frac{1}{4} + \cdots = 0$$

and we can keep adding zeroes:

$$0 + 0 - 1 + \frac{1}{2} + \cdots = 0$$

Putting these in a giant grid (row $i$ starts with $i - 1$ zeroes):
$$\begin{array}{cccccc} -1 & \frac{1}{2} & \frac{1}{4} & \frac{1}{8} & \frac{1}{16} & \cdots \\ 0 & -1 & \frac{1}{2} & \frac{1}{4} & \frac{1}{8} & \cdots \\ 0 & 0 & -1 & \frac{1}{2} & \frac{1}{4} & \cdots \\ 0 & 0 & 0 & -1 & \frac{1}{2} & \cdots \\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots \end{array}$$
Let $a_{ij}$ denote the entry in the $i$-th row and $j$-th column. What is $\sum_{i} \sum_{j} a_{ij}$?
If we go rows first then they all add to $0$, by our arguments above. For the columns first, each column has only finitely many nonzero entries, so:

1st column: $-1$
2nd column: $\frac{1}{2} - 1 = -\frac{1}{2}$
3rd column: $\frac{1}{4} + \frac{1}{2} - 1 = -\frac{1}{4}$
...
$j$-th column: $-\frac{1}{2^{j-1}}$

If we add all the values in the rows first, we expect $\sum_{i} \sum_{j} a_{ij} = 0$ since each row adds to $0$. So the answer is $0$, right? If we add columns first instead we get:

$$\sum_{j} \sum_{i} a_{ij} = -1 - \frac{1}{2} - \frac{1}{4} - \frac{1}{8} - \cdots = -2$$
So what is it? Is the series 0 or -2?
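Both computations can be imitated numerically (helper name ours). Each column has only finitely many nonzero entries, so its sum is exact; each row needs its full geometric tail, which sums to $0$ per row:

```python
def column_sum(j):
    """Column j has exactly j nonzero entries: 1/2^(j-1), ..., 1/2, then -1."""
    return -1.0 + sum(1 / 2**k for k in range(1, j))

print([column_sum(j) for j in range(1, 5)])        # [-1.0, -0.5, -0.25, -0.125]
print(sum(column_sum(j) for j in range(1, 60)))    # close to -2

# Each row, by contrast, is -1 + 1/2 + 1/4 + ... = 0 in the limit,
# so summing rows first gives 0 + 0 + 0 + ... = 0.
```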
The Third Tale
Consider the series:

$$\sum_{n=0}^{\infty} (-1)^n = 1 - 1 + 1 - 1 + 1 - \cdots$$

We could add in groups like:

$$(1 - 1) + (1 - 1) + (1 - 1) + \cdots = 0 + 0 + 0 + \cdots = 0$$

But we could also group like:

$$1 + (-1 + 1) + (-1 + 1) + \cdots = 1 + 0 + 0 + \cdots = 1$$

So again, which is it?
The Conclusion
The moral of these tales is that we need to be precise about what it means to converge! As such, we'll develop these definitions carefully.