
Convergent Sequences

If we analyze the values of a sequence we can see that certain behaviors or patterns can occur such as the sequence becoming monotonically increasing or decreasing or the sequence staying within a certain range, i.e. being bounded between two values.

Another pattern we may notice is that the terms of the sequence get closer and closer to a certain value as the index \(n\) increases. Let’s consider the following sequence and its terms:

\[a_n = \frac{2n+3}{n} \text{ with } n \in \mathbb{N} \]
| \(n\) | 1 | 2 | 3 | 10 | 1000 | 100000 |
| --- | --- | --- | --- | --- | --- | --- |
| \(a_n\) | 5 | 3.5 | 3 | 2.3 | 2.003 | 2.00003 |

As \(n\) grows larger, the terms of the sequence get closer and closer to 2. This value is called the limit of the sequence. If the terms of the sequence get closer to some value as \(n\) increases, then we say the sequence converges to that value. If the terms of the sequence do not get close to a certain value, then we say the sequence diverges. Another common phrasing is that as \(n\) approaches infinity, the terms of the sequence approach the limit. We can formally write this as:

\[\lim_{n \to \infty} \frac{2n+3}{n} = 2 \]
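As a small numerical sketch (an addition for illustration, not part of the formal argument), we can evaluate the terms in code and watch them approach 2:

```python
# Evaluate a_n = (2n + 3) / n for growing n; the values approach the limit 2.
def a(n: int) -> float:
    return (2 * n + 3) / n

for n in [1, 10, 1000, 100000]:
    print(n, a(n))

# The distance to the limit is exactly 3/n, so it shrinks as n grows.
assert abs(a(100000) - 2) < 1e-4
```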

Another common way to refer to the limit is by saying that the Limes of the sequence is 2. Limes is the Latin word for limit.

Epsilon-Neighborhood

To formally define the limit of a sequence, we first introduce the concept of an epsilon neighborhood (or epsilon strip). This is a region around a suspected limit with a radius of \(\epsilon\), where \(\epsilon > 0\).

Epsilon neighborhood around the limit a with radius epsilon and entry index

The index of the term that first enters this neighborhood is called the dipping number or entry index, and is denoted as \(N_{\epsilon}\) or \(N_0\). So for example if we have a sequence \(a_n\) that converges to a limit \(L\), then for every \(\epsilon > 0\) we can define the neighborhood around \(L\) as:

\[\{n \in \mathbb{N} \mid |a_n - L| < \epsilon\} \text{ or } \{n \in \mathbb{N} \mid a_n \in (L-\epsilon, L+\epsilon)\} \]

The entry index \(N_{\epsilon}\) is the index of the term that first enters this neighborhood. This means that for all \(i \geq N_{\epsilon}\), the terms \(a_i\) are in the neighborhood around \(L\), i.e. \(a_i \in (L-\epsilon, L+\epsilon)\) for all \(i \geq N_{\epsilon}\).

Example

For a given sequence \(a_n = \frac{2n+3}{n}\) and \(\epsilon = 0.1\), we can calculate the entry index \(N_{0.1}\) by solving the inequality:

\[\begin{align*} \left|\frac{2n+3}{n} - 2\right| < 0.1 \\ \left|\frac{3}{n}\right| < 0.1 \\ \frac{3}{n} < 0.1 \\ 3 < 0.1n \\ 30 < n \end{align*} \]

So for \(\epsilon = 0.1\), the entry index \(N_{0.1} = 31\). We can check this by calculating the terms of the sequence for \(n=29\), \(n=30\), and \(n=31\):

  • \(a_{29} = \frac{2*29+3}{29} = 2.1034\)
  • \(a_{30} = \frac{2*30+3}{30} = 2.1\)
  • \(a_{31} = \frac{2*31+3}{31} = 2.0968\)

The entry index is 31, not 30, as the terms must be strictly less than 0.1 away from the limit and \(a_{30}\) is exactly 0.1 away.
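The entry index can also be found by brute force in code: scan \(n\) until the distance to the limit first drops below \(\epsilon\) (a sketch; since the distance \(\frac{3}{n}\) is strictly decreasing, the first index inside the neighborhood really is the entry index):

```python
# Find the entry index N_eps for a_n = (2n + 3) / n with limit 2 by scanning.
# Because |a_n - 2| = 3/n is strictly decreasing, the first index inside the
# neighborhood is the entry index.
def entry_index(eps: float) -> int:
    n = 1
    while abs((2 * n + 3) / n - 2) >= eps:
        n += 1
    return n

print(entry_index(0.1))  # 31, matching the calculation above
```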

This is what d’Alembert and Cauchy used to make the first definition of a limit. A sequence \(a_n\) converges to a limit \(L\) if for every \(\epsilon > 0\) there exists an entry index \(N_{\epsilon}\) such that for all \(n \geq N_{\epsilon}\) the terms \(a_n\) are in the \(\epsilon\)-neighborhood around \(L\). In other words, after a certain point in the sequence, all terms are within \(\epsilon\) of the limit. More formally:

\[L \text{ is the limit of } a_n \text{ if } \forall \epsilon > 0, \exists N_{\epsilon} \in \mathbb{N} \text{ such that } \forall n \geq N_{\epsilon}, |a_n - L| < \epsilon \]

Another way to define the limit is to say that a sequence \(a_n\) converges to a limit \(L\) if for every \(\epsilon > 0\) there are only finitely many terms of the sequence outside the \(\epsilon\)-neighborhood. These two definitions are equivalent: in the first definition only the terms with \(n < N_{\epsilon}\) can lie outside the neighborhood, which are finitely many, and consequently infinitely many terms lie inside the neighborhood. We can define the elements outside the neighborhood just like we did with the elements inside the neighborhood:

\[\{n \in \mathbb{N} | \quad |a_n - L| \geq \epsilon\} \text{ or } \{n \in \mathbb{N} | a_n \notin (L-\epsilon, L+\epsilon)\} \]
Note

An important note is that the limit needs to be a real number if it is a real valued sequence. The limit can not be infinity as infinity is not a real number. So if a sequence tends to infinity, it is divergent. We can define this more formally: a sequence \(a_n\) diverges to \(+\infty\) if for every \(T > 0\) there exists an entry index \(N_T\) such that for all \(n \geq N_T\) the terms satisfy \(a_n > T\):

\[\lim_{n \to \infty} a_n = +\infty \text{ if } \forall T > 0, \exists N_T \in \mathbb{N} \text{ such that } \forall n \geq N_T, a_n > T \]

The same holds for \(-\infty\): the sequence \(a_n\) diverges to \(-\infty\) if \((-a_n)\) diverges to \(+\infty\).

If a sequence converges to the limit 0 then we say that the sequence is a null sequence. These are important sequences to look at as they are the building blocks for many other sequences and also later for series.

For each sequence the limit is unique. So if a sequence converges to a limit, then this limit is unique. The proof of this is rather intuitive. Let’s assume that a sequence converges to two different limits \(L\) and \(M\). We can then define an epsilon so that the neighborhoods around \(L\) and \(M\) do not overlap, for example \(\epsilon = \frac{|L-M|}{2}\):

\[(L-\epsilon, L+\epsilon) \cap (M-\epsilon, M+\epsilon) = \emptyset \]
Epsilon neighborhood around two limits L and M

We know that by definition only finitely many terms lie outside an epsilon neighborhood and infinitely many terms lie inside it. However, if the neighborhoods do not overlap, then the infinitely many terms inside the neighborhood around \(L\) would have to be among the finitely many terms outside the neighborhood around \(M\), and vice versa. This is a contradiction and therefore the limit must be unique.

We can also show that all sequences that converge are bounded. Intuitively this might make sense to some, but we can also prove it. If a sequence converges to a limit \(L\) then we can choose \(\epsilon = 1\). We then know that all terms from \(N_1\) on lie in the neighborhood around \(L\), so they are bounded by \(L-1\) and \(L+1\). Outside of this neighborhood there are only finitely many terms, and a finite set of real numbers always has a maximum and a minimum. So the whole sequence is bounded below by the minimum of \(L-1\) and the finitely many terms before \(N_1\), and bounded above by the maximum of \(L+1\) and those same terms. (This holds for real valued sequences; if one of the bounds were infinite the sequence would not be bounded.)

Illustration of the idea that convergent sequences are bounded

However, the other way around is not true. Not all bounded sequences converge. For example the sequence \(a_n = (-1)^n\) is bounded between -1 and 1 but does not converge. This is because the terms keep switching between -1 and 1 and do not get closer to a certain value.

\[a_n \text{ converges} \implies a_n \text{ is bounded} \]
Constant Sequence

For a constant sequence such as \(a_n = 5\) the limit is the constant itself, so in this case 5. This is rather obvious but we can also prove it. By definition we know that after some specific index \(N_{\epsilon}\) all terms have to be within the epsilon neighborhood around the limit. Because every term equals the limit \(L\) and epsilon is larger than 0, the following is satisfied for all \(n \geq 1\):

\[|a_n - L| = 0 < \epsilon \]

In this case \(L\) is simply 5. This can also be generalized to any constant sequence:

\[(a_n)_{n \geq 1} = c \implies \lim_{n \to \infty} a_n = c \]

Because the following always holds:

\[\begin{align*} |a_n - c| &= |c - c| = 0 \quad \forall n \geq 1 \\ |a_n - c| &< \epsilon \quad \forall n \geq 1 \text{ for any } \epsilon > 0 \end{align*} \]
Harmonic Sequence

The next sequence we can look at is the following:

\[(a_n)_{n \geq 1} = \frac{1}{n} = 1, \frac{1}{2}, \frac{1}{3}, \ldots \]

This is the so called harmonic sequence.

Todo

Show desmos graph.

Can this proof really not be done without assuming that the limit is 0?

For this sequence we again want \(|a_n - L| < \epsilon\) for all \(n \geq N_{\epsilon}\), where we suspect the limit is \(L = 0\). By the Archimedean principle we know that for any \(x > 0\) and any \(y \in \mathbb{R}\) there exists an \(n \in \mathbb{N}\) such that \(y \leq nx\). If we set \(x = \epsilon\) and \(y = 1\) then we get the following:

\[\begin{align*} y \leq nx \\ 1 \leq N_{\epsilon} \epsilon \\ \frac{1}{N_{\epsilon}} \leq \epsilon \end{align*} \]

So we know that some index \(N_{\epsilon} \in \mathbb{N}\) exists (choosing the next larger index if necessary to make the inequality strict) such that:

\[\frac{1}{N_{\epsilon}} < \epsilon \]

Which then means that for all \(n \geq N_{\epsilon}\) the following holds:

\[|a_n - 0| = \frac{1}{n} \leq \frac{1}{N_{\epsilon}} < \epsilon \]

So the limit of the sequence is 0. We call such sequences that converge to 0 null or zero sequences. This is a very important class of sequences as they are a building block for convergent series. So we formally say that a sequence is a null sequence if:

\[\lim_{n \to \infty} a_n = 0 \]

We have seen that some index \(N_{\epsilon}\) exists such that for all \(n \geq N_{\epsilon}\) the harmonic sequence is within the epsilon neighborhood around 0. We can also calculate this index for a given epsilon. For example if we had \(\epsilon = 0.8\) then we can calculate \(N_{0.8}\) by doing the following:

\[\begin{align*} \frac{1}{N_{0.8}} &< 0.8 \\ 1 &< 0.8 \, N_{0.8} \\ \frac{1}{0.8} &< N_{0.8} \\ 1.25 &< N_{0.8} \end{align*} \]

So \(N_{0.8} = 2\), which means that for \(\epsilon = 0.8\) all terms from \(n=2\) on are within the epsilon neighborhood around 0.
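In code, the smallest index with \(\frac{1}{n} < \epsilon\) is the smallest integer strictly greater than \(\frac{1}{\epsilon}\) (a sketch illustrating the calculation above):

```python
import math

# Smallest n with 1/n < eps, i.e. the smallest integer strictly above 1/eps.
def harmonic_entry_index(eps: float) -> int:
    return math.floor(1 / eps) + 1

assert harmonic_entry_index(0.8) == 2   # 1/2 = 0.5 < 0.8, while 1/1 = 1 is not
assert harmonic_entry_index(0.1) == 11  # 1/10 = 0.1 is not *strictly* below 0.1
```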

Example

From the script: using the Archimedean principle to show that the limit of \(a_n = \frac{n}{n+1}\) is 1.

Divergent Alternating Sequence

We have seen lots of examples of sequences that converge to a limit. But what about sequences that do not converge? Let’s look at the following sequence:

\[(a_n)_{n \geq 1} = (-1)^n = -1, 1, -1, 1, \ldots \]

Intuitively it is clear that this sequence does not converge as the terms keep switching between -1 and 1. We can also prove this by contradiction. Let’s assume that the sequence converges to a limit \(L\). We know that \(|a_n - a_{n+1}| = 2\) for all \(n \in \mathbb{N}\). Then the following must hold for every \(\epsilon > 0\) and all \(n \geq N_{\epsilon}\):

\[|a_n - L| < \epsilon \]

Now let’s choose \(\epsilon = \frac{1}{2}\). Then there exists an index \(N_{\epsilon}\) such that for all \(n \geq N_{\epsilon}\) the following holds:

\[\begin{align*} 2 = |a_n - a_{n+1}| &= |a_n - L + L - a_{n+1}| \\ &\leq |a_n - L| + |L - a_{n+1}| \\ &< \epsilon + \epsilon = 1 \end{align*} \]

This is a contradiction and therefore the sequence does not converge. Note that we just added 0 by adding and subtracting \(L\) and then used the triangle inequality.

Arithmetic Sequence

Lastly let’s look at the sequence \((a_n)_{n \geq 1} = n\). This sequence does not converge as the terms keep getting larger and larger. We can prove this by contradiction. Let’s assume that the sequence converges to a limit \(L\). Then for \(\epsilon = 1\) we know that there exists an index \(N_{\epsilon}\) such that for all \(n \geq N_{\epsilon}\) the following holds:

\[|a_n - L| < 1 \]

However, the terms \(a_n = n\) grow without bound, so for \(n \geq \max(N_{\epsilon}, L + 1)\) we have \(a_n - L \geq 1\), which contradicts \(|a_n - L| < 1\). So the sequence does not converge.

This sequence is actually a special case of the arithmetic sequence. An arithmetic sequence is a sequence where the difference between two consecutive terms is constant. In this case the constant is 1 but we could also for example look at the sequence \(a_n = 2n + 3\) as an arithmetic sequence with a common difference of 2. We can also write this more generally as:

\[a_n = a_1 + (n-1)d \text{ with } d \in \mathbb{R} \text{ and } n \in \mathbb{N} \]

Properties of Convergent Sequences

If \(a_n\) and \(b_n\) are convergent sequences with limits \(a\) and \(b\) respectively, then the following sequences are also convergent:

  • \(c \cdot a_n\) is convergent with \(\lim_{n \to \infty} {c \cdot a_n} = c \cdot \lim_{n \to \infty} {a_n} = c \cdot a\) for \(c \in \mathbb{R}\)

  • \(a_n \pm b_n\) is convergent with \(\lim_{n \to \infty}{(a_n \pm b_n)}={{\lim_{n \to \infty}{a_n}} \pm {\lim_{n \to \infty}{b_n}}}={a \pm b}\)

Example from her notes with \(a_n = \frac{n}{n+1}\): she splits it into \(\frac{1}{n}\) and \(\frac{1}{n+1}\) and then shows that the limit is the sum of the limits.

  • \(a_n \cdot b_n\) is convergent with \(\lim_{n \to \infty}{(a_n \cdot b_n)}={{\lim_{n \to \infty}{a_n}} \cdot {\lim_{n \to \infty}{b_n}}}={a \cdot b}\)

Example from her notes with \(a_n = (1 + \frac{1}{n})^b\)

  • \(\frac{a_n}{b_n}\) is convergent with \(\lim_{n \to \infty}{\frac{a_n}{b_n}}={{\lim_{n \to \infty}{a_n}} \over {\lim_{n \to \infty}{b_n}}}={a \over b}\), provided \(b \neq 0\)

This follows from the rules above. An example could be \(a_n = \frac{n^2 - 2n}{n^2 + n + 1}\).

  • If there exists a \(K \geq 1\) and \(a_n \leq b_n\) for all \(n \geq K\) then \(\lim_{n \to \infty}{a_n} \leq \lim_{n \to \infty}{b_n}\) in other words \(a \leq b\).

  • The product of a bounded sequence and a null sequence is always a null sequence.

Squeeze Theorem

The squeeze theorem or sometimes also called the sandwich theorem states that if we have two sequences \((a_n)_{n\geq 1}\) and \((c_n)_{n\geq 1}\) that have the same limit \(L\) so:

\[\lim_{n \to \infty} a_n = \lim_{n \to \infty} c_n = L \]

and a third sequence \((b_n)_{n\geq 1}\) for which after a certain index \(K\) the following holds:

\[a_n \leq b_n \leq c_n \text{ for all } n \geq K \]

Then the sequence \((b_n)_{n\geq 1}\) also converges to \(L\) so:

\[\lim_{n \to \infty} b_n = \lim_{n \to \infty} a_n = \lim_{n \to \infty} c_n = L \]

This is a very simple but powerful theorem that can be used to show that many sequences converge to a certain limit. The idea is that if we can find two sequences that are always above and below the sequence we are interested in and these two sequences converge to the same limit, so they slowly get closer and closer to each other and squeeze the sequence we are interested in, then the sequence we are interested in also converges to the same limit.

Illustration of the idea behind the squeeze theorem
Example

Let’s look at the sequence \(a_n = \frac{\sin(n)}{n}\). The sine function is bounded between -1 and 1, so the following inequalities hold:

\[\frac{-1}{n} \leq \frac{\sin(n)}{n} \leq \frac{1}{n} \]

As \(n\) goes to infinity, the terms on the left and right side go to 0. So by the squeeze theorem the sequence \(a_n = \frac{\sin(n)}{n}\) also converges to 0.

\[\lim_{n \to \infty} \frac{\sin(n)}{n} = 0 \]
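We can check the squeeze numerically (an illustrative sketch, not a proof):

```python
import math

# Verify -1/n <= sin(n)/n <= 1/n for many n, and that the terms get small.
for n in range(1, 1000):
    assert -1 / n <= math.sin(n) / n <= 1 / n

assert abs(math.sin(10**6) / 10**6) < 1e-5
```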

Monotone Convergence Theorem

Theorem by Karl Weierstrass:

If the sequence \((a_n)_{n\geq 1}\) is monotonically increasing and bounded above, then it converges. More precisely we can determine the limit as:

\[\lim_{n \to \infty} a_n = \sup\{a_n | n \geq 1\} \]

The same holds for monotonically decreasing sequences that are bounded below. Here the limit uses the infimum instead of the supremum.

\[\lim_{n \to \infty} a_n = \inf\{a_n | n \geq 1\} \] \[a_n \text{ is monotone} \land a_n \text{ is bounded} \implies a_n \text{ converges} \]

The other direction does not hold. For example the sequence \(a_n = (-1)^n \frac{1}{n}\) converges to 0 and is bounded by -1 and 1, but it is neither monotonically increasing nor decreasing as its terms keep alternating in sign.

Illustration of the idea behind the monotone convergence theorem

Polynomial vs Exponential

\(\lim_{n \to \infty} n^b q^n = 0\) for \(|q| < 1\): this shows that any power of \(n\) grows slower than any exponential function, which is important for algorithms.

\(\lim_{n \to \infty} n^{\frac{1}{n}} = 1\)

Limits of Recursive Sequences

From the notes, more generally: the recursively defined sequence \(a_n = \frac{1}{2}\left(a_{n-1} + \frac{c}{a_{n-1}}\right)\) with \(a_1 = c\) converges for \(c > 1\), using the monotone convergence theorem (Weierstrass).

The recursively defined sequence \(a_{n+1} = \frac{1}{2}\left(a_n + \frac{2}{a_n}\right)\) converges to \(\sqrt{2}\).
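This recursion converges very quickly; a few iterations in code get within machine precision of \(\sqrt{2}\) (a sketch; calling it Heron's method is my label, not the notes'):

```python
import math

# Iterate a_{n+1} = (a_n + 2/a_n) / 2 starting at a_1 = 2; this is Heron's
# method for square roots and converges quadratically to sqrt(2).
a = 2.0
for _ in range(10):
    a = (a + 2 / a) / 2

assert abs(a - math.sqrt(2)) < 1e-12
```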

Euler’s Number

Lots of possible origins but one is analyzing the compound interest formula.

First comes Bernoulli’s inequality. Does this come from probability theory?

\[(1+x)^n \geq 1+nx \text{ for all } x \geq -1 \text{ and } n \in \mathbb{N} \] \[\lim_{n \to \infty} \left(1+{1\over n}\right)^n = e \]
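Numerically, \((1 + \frac{1}{n})^n\) creeps toward \(e\) rather slowly (a sketch; the error shrinks roughly like \(\frac{e}{2n}\)):

```python
import math

# Compound-interest expression approaching Euler's number e.
for n in [1, 10, 1000, 10**6]:
    print(n, (1 + 1 / n) ** n)

assert abs((1 + 1 / 10**6) ** 10**6 - math.e) < 1e-4
```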

Limes Superior and Inferior

Using the monotone convergence theorem we can perform a very important operation. If we are given a bounded sequence \(a_n\) we can define two new sequences \(b_n\) and \(c_n\) as follows for all \(n \geq 1\):

\[b_n = \inf\{a_k | k \geq n\} \text{ and } c_n = \sup\{a_k | k \geq n\} \]

Because \(a_n\) is bounded we know that \(b_n\) and \(c_n\) are also bounded, as the infimum and supremum are the greatest lower bound and least upper bound of the tail of the sequence.

Example

Let’s look at the sequence \(a_n = (-1)^n (1 + \frac{1}{n})\). This sequence is bounded between -2 and 2 and does not converge as it oscillates between positive and negative values. However, we can define the two sequences \(b_n\) and \(c_n\) as follows:

\[\begin{align*} b_1 &= \inf\{a_k | k \geq 1\} = \inf\{-2, 1.5, -1.333, 1.25, \ldots\} = -2 \\ b_2 &= \inf\{a_k | k \geq 2\} = \inf\{1.5, -1.333, 1.25, -1.2, \ldots\} = -1.333 \\ b_3 &= \inf\{a_k | k \geq 3\} = \inf\{-1.333, 1.25, -1.2, \ldots\} = -1.333 \\ b_4 &= \inf\{a_k | k \geq 4\} = \inf\{1.25, -1.2, 1.167, \ldots\} = -1.2 \\ &\vdots \\ c_1 &= \sup\{a_k | k \geq 1\} = \sup\{-2, 1.5, -1.333, 1.25, \ldots\} = 1.5 \\ c_2 &= \sup\{a_k | k \geq 2\} = \sup\{1.5, -1.333, 1.25, -1.2, \ldots\} = 1.5 \\ c_3 &= \sup\{a_k | k \geq 3\} = \sup\{-1.333, 1.25, -1.2, \ldots\} = 1.25 \\ c_4 &= \sup\{a_k | k \geq 4\} = \sup\{1.25, -1.2, 1.167, \ldots\} = 1.25 \\ &\vdots \end{align*} \]

From the example we can see that the sequences \(b_n\) and \(c_n\) are monotonic. This makes sense because as \(n\) increases, terms are removed from the set \(\{a_k | k \geq n\}\), so the infimum can only get larger and the supremum can only get smaller. So we can actually be more precise and say that \(b_n\) is monotonically increasing and \(c_n\) is monotonically decreasing and that they are both bounded. The fact that they are monotonic also follows from the following properties of the supremum and infimum where \(A \subseteq B\):

\[\begin{align*} \text{If B is bounded from above then } \sup(A) &\leq \sup(B) \\ \text{If B is bounded from below then } \inf(A) &\geq \inf(B) \end{align*} \]

In this case the set \(A\) corresponds to the tail of the sequence starting at \(n+1\) and the set \(B\) to the tail starting at \(n\), so \(A \subseteq B\). Therefore the infimum of the tail starting at \(n+1\) is greater than or equal to the infimum of the tail starting at \(n\), and the supremum of the tail starting at \(n+1\) is less than or equal to the supremum of the tail starting at \(n\). This means that the sequence \(b_n\) is monotonically increasing and the sequence \(c_n\) is monotonically decreasing.

\[\begin{align*} b_n &\leq b_{n+1} \\ c_n &\geq c_{n+1} \end{align*} \]

Because they are both monotonic and bounded we can apply the monotone convergence theorem and say that they converge to a limit. We call these limits of these specially created sequences the Limes Superior and Limes Inferior of the original sequence. The Limes Superior is the limit of the sequence \(c_n\) and the Limes Inferior is the limit of the sequence \(b_n\). We can write this as:

\[\begin{align*} \limsup_{n \to \infty} a_n &= \lim_{n \to \infty} c_n \text{ Limes Superior} \\ \liminf_{n \to \infty} a_n &= \lim_{n \to \infty} b_n \text{ Limes Inferior} \end{align*} \]
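We can approximate the tail infima and suprema in code by truncating the infinite tail at a large index \(K\) (a numerical sketch; for this sequence the Limes Inferior is \(-1\) and the Limes Superior is \(1\)):

```python
# Tail infimum b_n and tail supremum c_n for a_k = (-1)^k (1 + 1/k),
# with the infinite tail truncated at K terms.
def a(k: int) -> float:
    return (-1) ** k * (1 + 1 / k)

K = 10_000

def b(n: int) -> float:
    return min(a(k) for k in range(n, K))

def c(n: int) -> float:
    return max(a(k) for k in range(n, K))

assert (b(1), c(1)) == (-2.0, 1.5)
assert b(2) == a(3)  # the most negative term in the tail from k = 2 is a_3
assert c(3) == a(4)  # the largest term in the tail from k = 3 is a_4
```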

We can interpret the Limes Superior and Limes Inferior as the largest and smallest limit points of the sequence. So in a way as \(n\) goes to infinity the Limes Superior is like an upper bound that the sequence approaches and the Limes Inferior is like a lower bound that the sequence approaches.

Because we also know that the infimum is always less than or equal to the supremum, we have \(b_n \leq c_n\) for all \(n\), and therefore the limit of the sequence \(b_n\) is less than or equal to the limit of the sequence \(c_n\):

\[\liminf_{n \to \infty} a_n \leq \limsup_{n \to \infty} a_n \]

We can also use the Limes Superior and Limes Inferior to determine if a sequence converges or diverges. First let’s just look at a bounded sequence. We have already seen that we can create two sequences \(b_n\) and \(c_n\) that are monotonic and bounded and therefore converge. Their limits, the Limes Superior and Limes Inferior, act like upper and lower bounds of the sequence as \(n\) goes to infinity. So if the Limes Superior and Limes Inferior are equal, then these bounds slowly squeeze the sequence together toward a single value, which suggests the sequence converges to it.

So we can say that if the Limes Superior and Limes Inferior of a bounded sequence converge to the same limit, then the original sequence also converges. That the sequence is bounded is important as otherwise the Limes Superior and Limes Inferior could diverge to infinity and would be the same but the original sequence would not converge. So we can say that:

\[\text{If } \limsup_{n \to \infty} a_n = \liminf_{n \to \infty} a_n \text{ and } a_n \text{ is bounded then } a_n \text{ converges} \]
Proof

We can also prove this. Let’s assume that the Limes Superior and Limes Inferior converge to the same limit \(L\). Then we know that \(b_n\) is monotonically increasing and converges to \(L\) and \(c_n\) is monotonically decreasing and converges to \(L\). So we can say that:

\[L - \epsilon < b_n \leq a_n \leq c_n < L + \epsilon \text{ for all } n \geq N_{\epsilon} \]

This would then imply that:

\[L - \epsilon < a_n < L + \epsilon \iff |a_n - L| < \epsilon \text{ for all } n \geq N_{\epsilon} \]

We can also show the other way around. If the sequence converges to a limit \(L\) then we know that for all \(\epsilon > 0\) there exists an index \(N_{\epsilon}\) such that for all \(n \geq N_{\epsilon}\) the following holds:

\[|a_n - L| < \epsilon \]

So eventually all terms are within the epsilon neighborhood around \(L\), \((L - \epsilon, L + \epsilon)\). This means that the infimum and supremum of the tail also satisfy \(L - \epsilon \leq b_n \leq c_n \leq L + \epsilon\) for all \(n \geq N_{\epsilon}\), so the Limes Inferior and Limes Superior both equal \(L\).

Cauchy Criterion

We have seen a few ways now how to calculate the limit of a sequence or determine if a sequence converges based on certain properties that the sequence has, such as being bounded and monotonic. However, there is another way to determine if a sequence converges or diverges. This is called the Cauchy criterion.

Previously we looked to see if the Limes Superior and Limes Inferior of a sequence converge to the same limit. Cauchy’s idea is similar to this but instead of looking at the Limes Superior and Limes Inferior we look at the distance between two terms of the sequence. The idea is that if the terms of the sequence get closer and closer together, then they must converge to a limit.

This is what the Cauchy criterion states. A sequence is convergent if and only if it is a so-called Cauchy sequence, or in short, the sequence is Cauchy. A sequence is Cauchy if the following holds:

\[\forall \epsilon > 0 \exists N_{\epsilon} \in \mathbb{N} \text{ such that } \forall n, m \geq N_{\epsilon} \text{ we have } |a_n - a_m| < \epsilon \]

So if we can find an index \(N_{\epsilon}\) such that for all terms after this index the distance between any two terms is less than \(\epsilon\), then the sequence converges.

Todo

As an example let’s look at the harmonic sequence \(a_n = \frac{1}{n}\). We know that this sequence converges to 0, but we can also use the Cauchy criterion to show that it converges by bounding the distance between any two terms. Assuming without loss of generality that \(m \geq n \geq N_{\epsilon}\):

\[|a_n - a_m| = \left|\frac{1}{n} - \frac{1}{m}\right| = \frac{1}{n} - \frac{1}{m} \leq \frac{1}{n} \leq \frac{1}{N_{\epsilon}} \]

So for a given \(\epsilon\) we just choose the index \(N_{\epsilon}\) large enough that \(\frac{1}{N_{\epsilon}} < \epsilon\), and then the distance between any two terms after this index is less than \(\epsilon\).
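A brute-force check of the Cauchy condition for the harmonic sequence (a sketch; we can only test finitely many pairs, of course):

```python
import math

# For eps, pick N with 1/N < eps; then |1/n - 1/m| <= 1/N < eps for n, m >= N.
def cauchy_index(eps: float) -> int:
    return math.floor(1 / eps) + 1

eps = 0.01
N = cauchy_index(eps)
for n in range(N, N + 200):
    for m in range(N, N + 200):
        assert abs(1 / n - 1 / m) < eps
```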

Cauchy-Cantor Theorem

Todo

Weird stuff with intervals that can then be used to show that real numbers are uncountable.

Bolzano-Weierstrass Theorem

Todo

Every bounded sequence has a convergent subsequence.

Vector Sequences

So far we have looked at sequences of real numbers, often called real sequences. However, we can also look at sequences where the elements are vectors, these are called vector sequences. More formally we can define a vector sequence in \(\mathbb{R}^d\) as:

\[a: \mathbb{N} \to \mathbb{R}^d, \quad n \mapsto a_n = (a_{n1}, a_{n2}, \ldots, a_{nd}) \text{ with } a_{ni} \in \mathbb{R} \]

We can define the limit of a vector sequence in a similar way as we did for real sequences, we just need to replace the absolute value with the norm of the vector. So we can say that a vector sequence converges if there exists a vector \(L \in \mathbb{R}^d\) such that:

\[\forall \epsilon > 0 \exists N_{\epsilon} \in \mathbb{N} \text{ such that } \forall n \geq N_{\epsilon} \text{ we have } ||a_n - L|| < \epsilon \]

Just like with real sequences, this limit is unique and is written as:

\[\lim_{n \to \infty} a_n = L \]

At first you might think that limits of vectors could be rather complicated as we are now working with multiple dimensions. However, it turns out that the limit of a vector sequence is rather simple. The limit of a vector sequence is simply the limit along each dimension or coordinate/component.

If we have the vector \(L=(l_1, l_2, \ldots, l_d)\) and a vector sequence \(a_n\), think of the vector sequence as a matrix with \(d\) rows and \(n\) columns; each row \(i\) is then a real sequence of its own, and it converges to the component \(l_i\) of the limit. So the following two statements are equivalent:

  1. The vector sequence \(a_n\) converges to the vector \(L\), \(\lim_{n \to \infty} a_n = L\).
  2. \(\lim_{n \to \infty} a_{n,i} = l_i\) for all \(i=1,2,\ldots,d\).
Example

Let’s define a vector sequence as follows:

\[a_n = (\frac{1}{n}, 1 + \frac{1}{n}, (1 + \frac{1}{n})^2) \text{ with } n \geq 1 \]

If we now look at the first component of each term of the sequence we get:

\[(a_{n,1})_{n \geq 1} = (\frac{1}{n})_{n \geq 1} = (1, \frac{1}{2}, \frac{1}{3}, \ldots) \]

We already know this sequence as the harmonic sequence and we know that it converges to 0. Similarly, the sequence of the second components converges to 1 and the sequence of the third components converges to \(1^2 = 1\). So we can say that the vector sequence converges to the vector \(L=(0, 1, 1)\) as the limit of the vector sequence is simply the limit of each component. So we can write:

\[\lim_{n \to \infty} a_n = (\lim_{n \to \infty} a_{n,1}, \lim_{n \to \infty} a_{n,2}, \lim_{n \to \infty} a_{n,3}) = (0, 1, 1) \]
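The componentwise convergence can be checked numerically (a sketch using the sequence defined above):

```python
# Each coordinate of a_n = (1/n, 1 + 1/n, (1 + 1/n)^2) is a real sequence;
# their limits are 0, 1 and 1 respectively.
def a(n: int):
    return (1 / n, 1 + 1 / n, (1 + 1 / n) ** 2)

L = (0.0, 1.0, 1.0)
assert all(abs(x - l) < 1e-5 for x, l in zip(a(10**6), L))
```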

We can also look at a vector sequence that does not converge. For example let’s look at the following sequence:

\[a_n = (\frac{1}{n}, (-1)^n, n) \text{ with } n \geq 1 \]

This sequence does not converge as the second component oscillates between -1 and 1 and the third component diverges to infinity.

We already know that for real convergent sequences we can perform certain operations on the sequences and the limits of these sequences. The same holds for vector sequences. We can perform the same operations on vector sequences as we did for real sequences. So for the convergent vector sequences \(a_n\) and \(b_n\) we can say that:

\[\begin{align*} \lim_{n \to \infty} (a_n \pm b_n) &= \lim_{n \to \infty} a_n \pm \lim_{n \to \infty} b_n \\ \lim_{n \to \infty} (c a_n) &= c \lim_{n \to \infty} a_n \end{align*} \]

Where \(c\) is a constant. This is rather intuitive as if each of the components of the vector converges to a limit, then the sum of the vector sequences is as if we would shift the limit of the vector sequence by the limit of the other vector sequence. The same holds for the product of a constant and a vector sequence.

Bounded Vector Sequences

For real sequences we were able to say that a sequence is bounded if it is bounded above and below. So for some upper bound \(u\) and lower bound \(l\) we had for a bounded real sequence:

\[a_n \in [l, u] \text{ for all } n \in \mathbb{N} \]

In other words, no term ever became infinitely large or small. For vector sequences we can define the same concept. A vector sequence is bounded if there exists a number \(R > 0\) such that:

\[||a_n|| \leq R \text{ for all } n \in \mathbb{N} \]
Todo

If the sequences of the components are bounded is the vector sequence also bounded?

Example

An example of a bounded vector sequence is the following:

\[a_n = (\frac{1}{n}, \sin(n)) \text{ with } n \geq 1 \]

Intuitively, as \(n\) goes to infinity the first component converges to 0 and the second component oscillates between -1 and 1. So we can say that the vector sequence is bounded. We can also show this by calculating the norm of the vector sequence:

\[||a_n|| = \sqrt{(\frac{1}{n})^2 + (\sin(n))^2} \leq \sqrt{(\frac{1}{n})^2 + 1} \leq \sqrt{2} \text{ for all } n \geq 1 \]

An example of an unbounded vector sequence is the following:

\[a_n = (n, (-1)^n) \text{ with } n \geq 1 \]

Again intuitively we can see that the first component diverges to infinity and the second component oscillates between -1 and 1. So we would say that the vector sequence is unbounded. We can also show this by calculating the norm of the vector sequence:

\[||a_n|| = \sqrt{n^2 + ((-1)^n)^2} = \sqrt{n^2 + 1} \geq n \to \infty \text{ as } n \to \infty \]

Because vector sequences can be bounded, we can also use the Cauchy criterion and the Bolzano-Weierstrass theorem to show that a vector sequence converges. However, we can not use the monotone convergence theorem, as there is no natural order on the vector space, so we can not say that a vector sequence is monotonic.

Todo

Examples of using the theorems for vector sequences.

Complex Sequences

Todo

Complex sequences are similar to vector sequences but with complex numbers. What are the rules?

She showed:

  1. \(\lim_{n \to \infty} z_n = z\) then \(\lim_{n \to \infty} \overline{z_n} = \overline{z}\)
  2. \(\lim_{n \to \infty} z_n = z\) then \(\lim_{n \to \infty} |z_n| = |z|\)
  3. \(\lim_{n \to \infty} (z_n + w_n) = z + w\) if \(z_n \to z\) and \(w_n \to w\)
  4. \(\lim_{n \to \infty} (z_n \cdot w_n) = z \cdot w\) if \(z_n \to z\) and \(w_n \to w\)
  5. \(\lim_{n \to \infty} (z_n / w_n) = z / w\) if \(z_n \to z\) and \(w_n \to w\) and \(w \neq 0\) and \(w_n \neq 0\) for all \(n\)

Example:

\[\lim_{n \to \infty} (\frac{n^2 + 2}{n^3 + 1} + i \frac{n^3 + 2n + 1}{n^3 + n + 1}) = 0 + i = i \]
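We can check this numerically with Python's complex numbers (an illustrative sketch): the real part tends to 0 and the imaginary part to 1.

```python
# Real part (n^2 + 2)/(n^3 + 1) -> 0, imaginary part
# (n^3 + 2n + 1)/(n^3 + n + 1) -> 1, so z_n -> i.
def z(n: int) -> complex:
    return (n**2 + 2) / (n**3 + 1) + 1j * (n**3 + 2 * n + 1) / (n**3 + n + 1)

assert abs(z(10**4) - 1j) < 1e-3
```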
Todo

Todo these two examples still need to be integrated somewhere as an example.

Are there any more examples of well known and useful sequences?

Geometric Sequences

Sequences of the form \(a_n= a_1 \cdot q^{n-1}\) are geometric sequences. Each term is the geometric mean of its two neighboring terms: \(a_n=\sqrt {a_{n-1} \cdot a_{n+1}}\)

A geometric sequence \(a_n= a_1 \cdot q^{n-1}\)

  • with \(|q|>1\) is divergent
  • with \(|q|<1\) is convergent with limit 0
  • with \(q=1\) is the constant sequence \(a_1\)
  • with \(q=-1\) is divergent, since it alternates.
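The four cases can be checked numerically with \(a_1 = 1\) (a small sketch of the case distinction above):

```python
# Behaviour of q^(n-1), the geometric sequence with a_1 = 1.
def term(q: float, n: int) -> float:
    return q ** (n - 1)

assert abs(term(0.5, 60)) < 1e-15         # |q| < 1: tends to 0
assert term(2.0, 60) > 1e15               # |q| > 1: grows without bound
assert term(1.0, 60) == 1.0               # q = 1: constant sequence
assert term(-1.0, 60) == -term(-1.0, 61)  # q = -1: keeps alternating
```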

Rational Sequences

For a rational sequence whose numerator is a polynomial of degree \(k\) and whose denominator is a polynomial of degree \(l\), the following holds:

\[ \lim_{n \to \infty}{{a_kn^k+a_{k-1}n^{k-1}+...+a_0}\over{b_ln^l+b_{l-1}n^{l-1}+...+b_0}} = \begin{dcases} {a_k\over b_l} \cdot \infty, & \text{if } k>l \\ {a_k\over b_l}, & \text{if } k=l \\ 0, & \text{if } k<l \end{dcases} \]
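The \(k = l\) case can be illustrated with the example \(a_n = \frac{n^2 - 2n}{n^2 + n + 1}\) mentioned earlier, whose limit is the ratio of the leading coefficients, \(\frac{1}{1} = 1\) (numerical sketch):

```python
# Equal degrees in numerator and denominator: the limit is the ratio of the
# leading coefficients, here 1/1 = 1.
def a(n: int) -> float:
    return (n**2 - 2 * n) / (n**2 + n + 1)

assert abs(a(10**6) - 1.0) < 1e-5
```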