Alternating series test

In mathematical analysis, the alternating series test is a method used to show that an alternating series converges when its terms (1) decrease in absolute value and (2) approach zero in the limit. The test was used by Gottfried Leibniz and is sometimes known as Leibniz's test, Leibniz's rule, or the Leibniz criterion. The test is only sufficient, not necessary, so some convergent alternating series may fail the first part of the test.[1][2][3]

For a generalization, see Dirichlet's test.[4][5][6]

Formal statement

Alternating series test

A series of the form

$\sum_{n=0}^{\infty} (-1)^n a_n = a_0 - a_1 + a_2 - a_3 + \cdots,$

where either all $a_n$ are positive or all $a_n$ are negative, is called an alternating series.

The alternating series test guarantees that an alternating series converges if the following two conditions are met:[1][2][3]

  1. $|a_n|$ decreases monotonically[a], i.e., $|a_{n+1}| \leq |a_n|$, and
  2. $\lim_{n \to \infty} a_n = 0$.
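
The two hypotheses are easy to check numerically for a concrete choice of terms. The following Python sketch is an added illustration, not part of the formal statement; the helper names and the sample terms $a_n = 1/(n^2+1)$ are chosen only for demonstration.

    # Heuristically check the two Leibniz hypotheses for a sample term sequence,
    # then print a few partial sums S_k = sum_{n=0}^{k} (-1)^n a_n.
    def hypotheses_hold(a, N=10_000, tol=1e-6):
        monotone = all(abs(a(n + 1)) <= abs(a(n)) for n in range(N))
        vanishing = abs(a(N)) < tol
        return monotone and vanishing

    def partial_sum(a, k):
        return sum((-1) ** n * a(n) for n in range(k + 1))

    a = lambda n: 1.0 / (n * n + 1)           # positive, decreasing, tends to 0
    print(hypotheses_hold(a))                 # True, so the test guarantees convergence
    print([round(partial_sum(a, k), 6) for k in (10, 100, 1000)])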

Alternating series estimation theorem

Moreover, let L denote the sum of the series; then the partial sum $S_k = \sum_{n=0}^{k} (-1)^n a_n$ approximates L with error bounded by the next omitted term:

$|S_k - L| \leq |S_k - S_{k+1}| = |a_{k+1}|.$
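
As a numerical illustration of this bound (an added sketch, not part of the statement; the series $\sum_{n=0}^{\infty} (-1)^n/n!$ is chosen only because its sum $1/e$ is known exactly):

    import math

    # Check |S_k - L| <= a_(k+1) for the alternating series sum (-1)^n / n!,
    # whose exact sum L = 1/e makes the true error computable.
    L = 1 / math.e
    for k in range(8):
        S_k = sum((-1) ** n / math.factorial(n) for n in range(k + 1))
        error = abs(S_k - L)
        bound = 1 / math.factorial(k + 1)     # the first omitted term a_(k+1)
        assert error <= bound
        print(f"k={k}: |S_k - L| = {error:.3e} <= a_(k+1) = {bound:.3e}")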

Proof

Suppose we are given a series of the form $\sum_{n=1}^{\infty} (-1)^{n-1} a_n$, where $\lim_{n \to \infty} a_n = 0$ and $a_n \geq a_{n+1}$ for all natural numbers n. (The case $\sum_{n=1}^{\infty} (-1)^{n} a_n$ follows by taking the negative.)[8]

Proof of the alternating series test

We will prove that both the partial sums $S_{2m+1} = \sum_{n=1}^{2m+1} (-1)^{n-1} a_n$ with an odd number of terms and $S_{2m} = \sum_{n=1}^{2m} (-1)^{n-1} a_n$ with an even number of terms converge to the same number L. Thus the usual partial sum $S_k = \sum_{n=1}^{k} (-1)^{n-1} a_n$ also converges to L.

The odd partial sums decrease monotonically:

$S_{2(m+1)+1} = S_{2m+1} - a_{2m+2} + a_{2m+3} \leq S_{2m+1},$

while the even partial sums increase monotonically:

$S_{2(m+1)} = S_{2m} + a_{2m+1} - a_{2m+2} \geq S_{2m},$

both because $a_n$ decreases monotonically with n.

Moreover, since the $a_n$ are positive, $S_{2m+1} - S_{2m} = a_{2m+1} > 0$. Thus we can collect these facts to form the following suggestive inequality:

$a_1 - a_2 = S_2 \leq S_{2m} \leq S_{2m+1} \leq S_1 = a_1.$

Now, note that $a_1 - a_2$ is a lower bound of the monotonically decreasing sequence $S_{2m+1}$; the monotone convergence theorem then implies that this sequence converges as m approaches infinity. Similarly, the sequence of even partial sums converges too.

Finally, they must converge to the same number because $\lim_{m \to \infty} (S_{2m+1} - S_{2m}) = \lim_{m \to \infty} a_{2m+1} = 0.$

Call the limit L; then the monotone convergence theorem also tells us the extra information that

$S_{2m} \leq L \leq S_{2m+1}$

for any m. This means the partial sums of an alternating series also "alternate" above and below the final limit. More precisely, when there is an odd (even) number of terms, i.e. when the last term is a plus (minus) term, the partial sum is above (below) the final limit.

This understanding leads immediately to an error bound on the partial sums, shown below.
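
This bracketing is easy to see numerically. The sketch below is an added illustration; the series $1 - \tfrac{1}{3} + \tfrac{1}{5} - \cdots$ is used only because its sum is known to be $\pi/4$, so the limit can be printed alongside the partial sums.

    import math

    # Odd partial sums decrease toward the limit from above, even ones increase
    # toward it from below, for 1 - 1/3 + 1/5 - ...  (sum = pi/4).
    def S(k):
        return sum((-1) ** (n - 1) / (2 * n - 1) for n in range(1, k + 1))

    L = math.pi / 4
    for m in (1, 2, 5, 10, 50):
        assert S(2 * m) <= L <= S(2 * m + 1)      # S_2m <= L <= S_(2m+1)
        print(f"m={m}: {S(2 * m):.6f} <= {L:.6f} <= {S(2 * m + 1):.6f}")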

Proof of the alternating series estimation theorem

We would like to show $|S_k - L| \leq a_{k+1}$ by splitting into two cases.

When k = 2m + 1, i.e. odd, then

$|S_{2m+1} - L| = S_{2m+1} - L \leq S_{2m+1} - S_{2m+2} = a_{2m+2}.$

When k = 2m, i.e. even, then

$|S_{2m} - L| = L - S_{2m} \leq S_{2m+1} - S_{2m} = a_{2m+1},$

as desired.

Both cases rely essentially on the last inequality derived in the previous proof.
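
The two cases can likewise be checked numerically; the following short sketch (an added illustration, using the same $\pi/4$ series as above) verifies both inequalities, including the sign of the error.

    import math

    # For odd k the partial sum lies above L with S_k - L <= a_(k+1);
    # for even k it lies below L with L - S_k <= a_(k+1).
    a = lambda n: 1 / (2 * n - 1)
    S = lambda k: sum((-1) ** (n - 1) * a(n) for n in range(1, k + 1))
    L = math.pi / 4
    for k in range(1, 11):
        if k % 2 == 1:
            assert 0 <= S(k) - L <= a(k + 1)
        else:
            assert 0 <= L - S(k) <= a(k + 1)
    print("both cases hold for k = 1..10")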

Examples

A typical example

The alternating harmonic series

$\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots$

meets both conditions for the alternating series test and converges.
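
A quick numerical look (an added sketch; the value $\ln 2$ of the sum is a standard fact not needed for the test itself) shows the partial sums approaching the limit with error no larger than the next term $1/(k+1)$:

    import math

    # Partial sums of 1 - 1/2 + 1/3 - ... approach ln 2, with error at most 1/(k+1).
    def S(k):
        return sum((-1) ** (n - 1) / n for n in range(1, k + 1))

    for k in (10, 100, 1000):
        err = abs(S(k) - math.log(2))
        print(f"k={k}: S_k = {S(k):.6f}, error = {err:.2e} <= {1 / (k + 1):.2e}")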

An example to show monotonicity is needed

All of the conditions in the test, namely convergence to zero and monotonicity, should be met in order for the conclusion to be true. For example, take the series

$\frac{1}{\sqrt{2}-1} - \frac{1}{\sqrt{2}+1} + \frac{1}{\sqrt{3}-1} - \frac{1}{\sqrt{3}+1} + \frac{1}{\sqrt{4}-1} - \frac{1}{\sqrt{4}+1} + \cdots$

The signs are alternating and the terms tend to zero. However, monotonicity is not present and we cannot apply the test. Actually, the series is divergent. Indeed, for the partial sum $S_{2m}$ we have

$S_{2m} = \frac{2}{1} + \frac{2}{2} + \frac{2}{3} + \cdots + \frac{2}{m},$

which is twice the partial sum of the harmonic series and therefore diverges. Hence the original series is divergent.
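
The divergence is easy to observe numerically; the sketch below (an added illustration) sums the series term by term and compares the even partial sums with twice the partial sums of the harmonic series.

    import math

    # j-th term (1-indexed) of 1/(sqrt(2)-1) - 1/(sqrt(2)+1) + 1/(sqrt(3)-1) - ...
    def term(j):
        n = (j + 1) // 2 + 1                  # sqrt(n) appears in the j-th term
        sign = 1 if j % 2 == 1 else -1
        return sign / (math.sqrt(n) - sign)

    for m in (10, 100, 1000):
        S_2m = sum(term(j) for j in range(1, 2 * m + 1))
        H_m = sum(1 / k for k in range(1, m + 1))
        print(f"m={m}: S_2m = {S_2m:.4f}, 2*H_m = {2 * H_m:.4f}")   # equal up to rounding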

The test is only sufficient, not necessary

Monotonicity in the Leibniz test is not a necessary condition, so the test itself is only sufficient, but not necessary. (The second part of the test is the well-known necessary condition of convergence for all series.)

Examples of nonmonotonic series that converge are:

$\sum_{n=2}^{\infty} \frac{(-1)^n}{n + (-1)^n} \qquad \text{and} \qquad \sum_{n=1}^{\infty} (-1)^n \, \frac{2 + (-1)^n}{n^2}.$

In fact, for every monotonic series it is possible to obtain an infinite number of nonmonotonic series that converge to the same sum by permuting its terms with permutations satisfying the condition in Agnew's theorem.[9]
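
As a small numerical aside on the first series above (an added sketch), the absolute values of its terms are visibly non-monotonic, yet the partial sums still settle to a finite value.

    # Terms of sum_{n>=2} (-1)^n / (n + (-1)^n): the absolute values are not
    # monotone, but the partial sums still converge.
    abs_terms = [1 / (n + (-1) ** n) for n in range(2, 10)]
    print([round(t, 3) for t in abs_terms])      # 0.333, 0.5, 0.2, 0.25, ... not decreasing

    def S(K):
        return sum((-1) ** n / (n + (-1) ** n) for n in range(2, K + 1))

    print([round(S(K), 6) for K in (10, 100, 1000, 10000)])   # stabilises near a finite limit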

Notes

  1. ^ In practice, the first few terms may increase. What is important is that $|a_{n+1}| \leq |a_n|$ for all $n$ after some point,[7] because a finite number of initial terms does not change a series' convergence or divergence.

References

  1. ^ a b Apostol 1967, pp. 403–404
  2. ^ a b Spivak 2008, p. 481
  3. ^ a b Rudin 1976, p. 71
  4. ^ Apostol 1967, pp. 407–409
  5. ^ Spivak 2008, p. 495
  6. ^ Rudin 1976, p. 70
  7. ^ Dawkins, Paul. "Calculus II - Alternating Series Test". Paul's Online Math Notes. Lamar University. Retrieved 1 November 2019.
  8. ^ The proof follows the idea given by James Stewart (2012) “Calculus: Early Transcendentals, Seventh Edition” pp. 727–730. ISBN 0-538-49790-4
  9. ^ Agnew, Ralph Palmer (1955). "Permutations preserving convergence of series" (PDF). Proc. Amer. Math. Soc. 6 (4): 563–564.
