Today we talked a little about the Newton’s Method Lab from last time. We talked about why computing (5/8)**1023 works, but 5**1023/8**1023 won’t (in most programming languages): since 5/8 < 1, every intermediate value of (5/8)**1023 stays small, while the separate powers 5**1023 and 8**1023 overflow long before the division happens.
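A quick demonstration of the issue using Python floats (Python’s arbitrary-precision integers actually sidestep the integer overflow, so the float versions are used here to mimic fixed-range arithmetic):

```python
# (5/8)**1023 works: each intermediate value stays small because 5/8 < 1.
ratio_power = (5 / 8) ** 1023
print(ratio_power)  # a tiny but representable positive float

# Computing the powers separately overflows the float range first:
# 5.0**1023 is about 1e715, far beyond the ~1.8e308 float maximum.
try:
    separate = 5.0 ** 1023 / 8.0 ** 1023
except OverflowError as err:
    print("overflow:", err)
```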
Then we introduced the secant method for finding roots:
\[ x_{n+2} = x_{n+1} - f(x_{n+1}) \left( \frac{x_{n+1}-x_n}{f(x_{n+1}) - f(x_n)} \right).\]
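The recurrence above can be sketched in a few lines of Python (the function and variable names here are my own, not from the lab):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Find a root of f using the secant recurrence
    x_{n+2} = x_{n+1} - f(x_{n+1}) * (x_{n+1} - x_n) / (f(x_{n+1}) - f(x_n))."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:  # secant line is horizontal; cannot continue
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Example: the positive root of x^2 - 2 is sqrt(2).
root = secant(lambda x: x * x - 2, 1.0, 2.0)
print(root)  # ≈ 1.4142135623730951
```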
Then we observed that this method has an error formula somewhat like the one for Newton’s method:
\[|x_{n+2} - r| \le \left( \frac{ M }{2 L } \right) |x_{n+1}-r| \, | x_n - r |\] where \(M\) is any upper bound on \(|f''(x)|\) and \(L\) is any lower bound on \(|f'(x)|\) on an interval containing \(r, x_n\), and \(x_{n+1}\).
This ends up implying that the secant method converges superlinearly (the order is approximately the golden ratio \(\varphi = \frac{1+\sqrt{5}}{2} \approx 1.618\)). So for practical purposes, the secant method is almost as good as Newton’s method and it has the same limitations.
Today we introduced the idea of fixed point iteration, which is a method for finding a fixed point of a function. For a function \(f:\mathbb{R} \rightarrow \mathbb{R}\), a fixed point is a number \(p \in \mathbb{R}\) such that \(f(p) = p\).
A fixed point is attracting if for all \(x_0\) sufficiently close to \(p\), the recursive sequence defined by \(x_{n+1} = f(x_n)\) converges to \(p\).
Theorem. If \(f\) is a differentiable function with a fixed point \(p\), and \(|f'(p)| < 1\), then \(p\) is an attracting fixed point.
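A minimal fixed point iteration in Python (a sketch; \(f(x) = \cos x\) is my own example, with fixed point \(p \approx 0.739\) where \(|f'(p)| = |\sin p| < 1\), so the theorem applies):

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{n+1} = f(x_n) until successive iterates agree to tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# cos has an attracting fixed point near 0.739 (the "Dottie number").
p = fixed_point(math.cos, 1.0)
print(p, math.cos(p))  # p and cos(p) agree
```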
We looked at these example functions:
We also talked about how to draw cobweb diagrams to see how the iterates \(x_n\) behave near a fixed point \(p\).
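A cobweb diagram alternates a vertical move to the curve \(y = f(x)\) with a horizontal move to the line \(y = x\). One way to sketch the construction is to generate just the segment endpoints (plotting them is an easy follow-up; the function name is my own):

```python
import math

def cobweb_points(f, x0, n_steps=10):
    """Return the (x, y) vertices of a cobweb diagram for x_{n+1} = f(x_n):
    vertical move to the curve, then horizontal move to the line y = x."""
    pts = [(x0, 0.0)]
    x = x0
    for _ in range(n_steps):
        y = f(x)
        pts.append((x, y))   # vertical move to the curve
        pts.append((y, y))   # horizontal move to y = x
        x = y
    return pts

# For f(x) = cos(x) the cobweb spirals in toward the fixed point near 0.739.
pts = cobweb_points(math.cos, 1.0, n_steps=5)
```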
Finally we looked at how to use the fixed point iteration method to find roots. We did the following example:
To do this, you need to convert the root-finding problem into a fixed point problem. There are several ways to do this:
Another good example would be to solve \(2^x = 5\).
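For \(2^x = 5\), one possible rearrangement (my own choice, not necessarily the one from class) is \(g(x) = x + (5 - 2^x)/4\): any fixed point of \(g\) solves \(2^x = 5\), and at the solution \(p = \log_2 5 \approx 2.32\) we get \(g'(p) = 1 - 5\ln 2/4 \approx 0.13\), so \(|g'(p)| < 1\) and the iteration converges.

```python
import math

def g(x):
    # Fixed point form of 2**x = 5: g(x) = x exactly when 2**x = 5.
    # The factor 1/4 is chosen so that |g'(p)| < 1 near p = log2(5).
    return x + (5 - 2 ** x) / 4

x = 2.0
for _ in range(100):
    x = g(x)

print(x, math.log2(5))  # both ≈ 2.321928...
```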
Today we finished Chapter 2 in the book (we covered Section 2.7). We also talked briefly about some other root-finding methods, such as the Householder methods, which are more complicated than Newton’s method but converge even more quickly. They aren’t used very often because Newton’s method is already fast enough for most situations.