Prove that the sequence $x_{n+1}=\frac{a}{1+x_n}$ with $a,x_0>0$ converges

I'm having a lot of trouble with this exercise… indeed, I think that the statement is false. The only tools that I have to solve this exercise are the algebra of limits, the $\epsilon$-$N$ definition of convergence of a sequence, and the theorem that says that if a sequence is monotone and bounded then it converges.

If $(x_n)\to x$ then $x=\frac{-1+\sqrt{1+4a}}{2}$. From some tests, and from looking at the graph of the function $\frac{1}{1+x}$, the sequence is certainly alternating.

Then I tried to prove that the sequence $(x_{n+2}-x_n)$ is monotone, but I failed. Any hint or solution, using the tools already mentioned, would be appreciated. Thank you.
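As a quick numerical experiment (not one of the allowed proof tools, just a sanity check of the alternating behaviour described above), one can iterate the recurrence and watch consecutive terms land on opposite sides of the candidate limit. The values $a=2$ and $x_0=0.5$ below are arbitrary illustrative choices:

```python
import math

# Iterate x_{n+1} = a/(1+x_n) and record which side of the candidate
# limit (-1+sqrt(1+4a))/2 each term falls on. a = 2, x0 = 0.5 are
# arbitrary choices for illustration.
a, x = 2.0, 0.5
limit = (-1 + math.sqrt(1 + 4 * a)) / 2  # equals 1 for a = 2

signs = []
for _ in range(20):
    signs.append(x > limit)   # True when the current term is above the limit
    x = a / (1 + x)

# Consecutive terms lie on opposite sides of the limit: the sequence alternates.
print(all(signs[i] != signs[i + 1] for i in range(len(signs) - 1)))  # True
print(abs(x - limit) < 1e-6)                                         # True
```

This only illustrates the behaviour for one choice of $a$ and $x_0$; it is not a proof.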

=================

3 Answers

=================

Hint: Let $\xi_a=\frac{-1+\sqrt{1+4a}}{2}$ and $f_a(x)=\frac{a}{1+x}$.

Prove that if $x_n<\xi_a$, then $f_a(x_n)=x_{n+1}>\xi_a$, and vice versa;

Prove that $f_a(f_a(x))=a\left(1-\frac{a}{1+x+a}\right)$ is an increasing function on $\mathbb{R}^+$, and deduce that both the sequences $\{x_{2n}\}_{n\geq 0}$ and $\{x_{2n+1}\}_{n\geq 0}$ are monotonic and bounded, hence convergent;

Prove that $\lim_{n\to +\infty}x_{2n}=\lim_{n\to +\infty}x_{2n+1}$, for instance by bounding $|x_{n}-x_{n+1}|$, or by noticing that $|f_a'(\xi_a)|=\frac{4a}{(1+\sqrt{1+4a})^2}<1$ and invoking the Banach fixed point theorem;
Once you get $\lim_{n\to +\infty}x_n = L$, prove that, by the continuity of $f_a$, $L$ has to fulfill $L=f_a(L)$, from which $L=\xi_a$.
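The steps of this hint can be checked numerically for one illustrative value of $a$ (a sketch only, with $a=3$ and $x_0=0.2$ chosen arbitrarily):

```python
import math

# Numerical sketch of the hint's steps for a = 3 (any a > 0 behaves the same).
a = 3.0
xi = (-1 + math.sqrt(1 + 4 * a)) / 2           # fixed point of f_a

def f(x):  return a / (1 + x)                   # f_a
def ff(x): return a * (1 - a / (1 + x + a))     # f_a o f_a, as in the hint

# Step 2: f_a o f_a is increasing on R^+ (checked on a grid).
grid = [i / 10 for i in range(1, 100)]
assert all(ff(u) < ff(v) for u, v in zip(grid, grid[1:]))

# Steps 2-3: even- and odd-indexed subsequences are monotone and squeeze xi.
xs = [0.2]
for _ in range(40):
    xs.append(f(xs[-1]))
evens, odds = xs[0::2], xs[1::2]
assert all(u <= v for u, v in zip(evens, evens[1:]))   # increasing (x0 < xi here)
assert all(u >= v for u, v in zip(odds, odds[1:]))     # decreasing
print(abs(xs[-1] - xi) < 1e-9)                          # True

# Step 3: the derivative bound |f_a'(xi)| < 1.
print(4 * a / (1 + math.sqrt(1 + 4 * a)) ** 2 < 1)      # True
```

None of this replaces the proof; it only confirms that each claimed inequality holds for this particular $a$.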
The following picture might be inspiring:
With $x_{n+2}=a\cdot\frac{1+x_n}{1+x_n+a}$ it is easy to see that this holds for any $a>0$; it does not matter if $a<1$ or $a>1$.

– Jack D’Aurizio

Oct 20 at 20:10


Aaahh… I see. I can check that $x>a\cdot\frac{1+x}{1+x+a}\implies x^2>0$, which is always true.

– Masacroso

Oct 20 at 20:17


One last question, Jack: what software did you use to draw this picture?

– Masacroso

Oct 20 at 20:18


@Masacroso: GeoGebra

– Jack D’Aurizio

Oct 20 at 20:19

Let $f(x)=\frac{a}{1+x}$.

We have

$$\forall n\geq 0,\quad x_{n+1}=f(x_n).$$

It is easy to see (by induction) that

$$\forall n\geq 0,\quad x_n>0.$$

The only fixed point of $f$ in the interval $(0,+\infty)$ is $L=\frac{-1+\sqrt{1+4a}}{2}$.

assume now that 0
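The fixed-point computation above is easy to verify numerically: $L=f(L)$ means $L(1+L)=a$, i.e. $L^2+L-a=0$, whose unique positive root is $\frac{-1+\sqrt{1+4a}}{2}$. A minimal check (the values of $a$ below are arbitrary test cases):

```python
import math

# Verify that L = (-1+sqrt(1+4a))/2 is a positive fixed point of
# f(x) = a/(1+x), i.e. that it solves L^2 + L - a = 0.
for a in (0.5, 1.0, 2.0, 10.0):
    L = (-1 + math.sqrt(1 + 4 * a)) / 2
    assert L > 0
    assert abs(L - a / (1 + L)) < 1e-12   # L is indeed a fixed point of f
print("ok")
```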

This idea generalizes to the case where $x_{n+1}$ is any rational function of $x_n$, by letting $X_n$ be a vector of powers of $x_n$.