
Extrapolate a sum using partial sums at powers of two

In an online textbook for MIT OCW 18.013A, Calculus with Applications, the author uses residue calculus to derive the well-known formula $$\sum_{n>0} n^{-2} = \frac{\pi^2}{6}$$ (see "Some Special Tricks"). He then writes:

> You can actually sum the first 128 (or 1024) terms of this sum on a spreadsheet and extrapolate by comparing the sum up to different powers of 2. If you extrapolate first forming $S_2(k) = S(2^k)-S(2^{k-1})$, then $S_3(k)=(4 S_2(k) - S_2(k-1))/3$, then $S_4(k) = (8 S_3(k) - S_3(k-1))/7$, etc. You can get this answer to enormous accuracy numerically and verify this conclusion.

Would someone please explain this method of extrapolation or provide a suitable reference?

This is an example of a general acceleration technique known as Richardson extrapolation. Define the partial sums $$a(n) = \sum_{k=1}^n \frac{1}{k^2}.$$ Comparing the tail with the integral $\int_n^\infty x^{-2}\,dx = 1/n$ (or simply inspecting the first few values of $a(2^k)$) strongly suggests that $a(n) \sim c_0 + c_1/n$; in general there will be further correction terms. So our ansatz is that $$s(n) := a(n) \sim c_0 + \frac{c_1}{n} + \frac{c_2}{n^2} + \dots$$ asymptotically. We can improve the convergence by eliminating the $1/n$ term: since $s(2n)$ carries half the $1/n$ error of $s(n)$, the combination $$s_1(n) := \frac{2\,s(2n) - s(n)}{2-1}$$ has no $1/n$ term left. The next step eliminates the $1/n^2$ term using $$s_2(n) := \frac{4\,s_1(2n) - s_1(n)}{4-1},$$ and we continue, eliminating one term at a time; each step improves the rate of convergence. If the asymptotic expansion has a different form, the same idea applies: eliminate one term at a time with the appropriate weights.
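For concreteness, here is a short numerical sketch in Python (my own illustration, not from the OCW text): it builds the partial sums $a(2^k)$ up to $k = 10$, i.e. 1024 terms as the textbook suggests, and then applies the elimination steps above, one power of $1/n$ per pass. The variable names and the cutoff $K = 10$ are arbitrary choices for the demonstration.

```python
import math

# Partial sums a(2^k) of sum 1/n^2, for k = 0..K.
K = 10
s = []
total = 0.0
n = 0
for k in range(K + 1):
    # extend the running partial sum up to n = 2^k
    while n < 2 ** k:
        n += 1
        total += 1.0 / (n * n)
    s.append(total)

# Richardson extrapolation: pass j eliminates the 1/n^j term of the
# assumed expansion a(n) ~ c0 + c1/n + c2/n^2 + ...
# column[k] holds the current estimate built from a(2^k).
column = s[:]
for j in range(1, K + 1):
    # new[k] = (2^j * column[k] - column[k-1]) / (2^j - 1)
    column = [(2 ** j * column[k] - column[k - 1]) / (2 ** j - 1)
              for k in range(1, len(column))]

print("plain partial sum a(1024):", s[-1])
print("extrapolated value       :", column[-1])
print("pi^2 / 6                 :", math.pi ** 2 / 6)
```

The raw partial sum $a(1024)$ is only accurate to about $10^{-3}$ (the tail is roughly $1/1024$), while the fully extrapolated value should agree with $\pi^2/6$ to close to double precision, which is the "enormous accuracy" the textbook refers to.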
