Retrieving "Fixed Point Iteration" from the archives
Cross-reference notes under review
While the archivists retrieve your requested volume, browse these clippings from nearby entries.
-
Iterative Algorithm
Linked via "fixed-point iteration"
$$x_{k+1} = G(x_k)$$
where $G$ is the iteration function. For root-finding problems, $G$ is usually derived by transforming the function $f(x)$ whose root is sought. For example, to solve $f(x) = 0$, one might rewrite the equation as $x = h(x)$, yielding the simple fixed-point iteration $x_{k+1} = h(x_k)$.
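As a rough illustration of this rewriting step (not taken from the source entry; the helper `fixed_point`, the tolerance, and the example equation $\cos x - x = 0$ rewritten as $x = \cos x$ are all assumptions), a minimal Python sketch might look like:

```python
import math

def fixed_point(h, x0, tol=1e-10, max_iter=100):
    """Iterate x_{k+1} = h(x_k) until successive iterates agree to within tol."""
    x = x0
    for k in range(max_iter):
        x_next = h(x)
        if abs(x_next - x) < tol:
            return x_next, k + 1
        x = x_next
    return x, max_iter

# Example: solve f(x) = cos(x) - x = 0 by rewriting it as x = h(x) = cos(x).
root, iterations = fixed_point(math.cos, x0=1.0)
print(f"fixed point ~ {root:.10f} after {iterations} iterations")
```

The starting guess `x0` and the stopping test on successive iterates are the kinds of initialization and termination choices the next heading refers to.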
Initialization and Termination
-
Iterative Algorithm
Linked via "Fixed-Point Iteration"
| Convergence Order ($p$) | Name | Example Algorithm | Notes |
| :---: | :--- | :--- | :--- |
| 1 | Linear | Fixed-Point Iteration | Error is reduced by a constant factor each step. |
| $\approx 1.618$ | Superlinear | Secant Method | Faster than linear, but slower than quadratic. |
| 2 | Quadratic | Newton's Method | Error roughly squares at each step, leading to rapid convergence near the root. |
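To make the contrast in the table concrete, a small sketch (an assumption rather than material from the linked entry, using $f(x) = \cos x - x$ and a hard-coded reference root) can print the error sequences for the first and last rows; the linear scheme shrinks the error by a roughly constant factor per step, while Newton's error roughly squares:

```python
import math

def errors(step, x0, root, n):
    """Return |x_k - root| for n applications of x_{k+1} = step(x_k)."""
    errs, x = [], x0
    for _ in range(n):
        x = step(x)
        errs.append(abs(x - root))
    return errs

f  = lambda x: math.cos(x) - x        # root-finding problem f(x) = 0
df = lambda x: -math.sin(x) - 1       # f'(x), needed for Newton's method

fixed_point = lambda x: math.cos(x)        # order 1: x_{k+1} = cos(x_k)
newton      = lambda x: x - f(x) / df(x)   # order 2: Newton's method

root = 0.7390851332151607                  # reference fixed point of cos(x)
for name, step in [("fixed-point", fixed_point), ("Newton", newton)]:
    print(name, ["%.1e" % e for e in errors(step, 1.0, root, 6)])
```
-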
Linear Convergence
Linked via "fixed-point iteration"
Linear convergence is deeply intertwined with the principles of the Contraction Mapping Theorem (also known as the Banach Fixed-Point Theorem). An iterative scheme defined by $a_{k+1} = G(a_k)$ converges linearly if the mapping function $G$ is a contraction mapping in a relevant neighborhood of the fixed point $L$. A sufficient condition for contraction is that the derivative of the mapping function, evaluated at the fixed point, satisfies $|G'(L)| < 1$. The absolute value of this der…
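A quick numerical check of this rate claim (a sketch under assumptions: $G(x) = \cos x$ as the contraction and its fixed point $L$ hard-coded, neither taken from the linked entry) shows the ratio of successive errors settling near $|G'(L)| = |\sin L| \approx 0.67$:

```python
import math

G = math.cos                     # assumed contraction near its fixed point
L = 0.7390851332151607           # fixed point of cos(x), hard-coded for the check
rate = abs(-math.sin(L))         # |G'(L)|, approximately 0.6736

x, prev_err = 1.0, abs(1.0 - L)
for k in range(1, 9):
    x = G(x)
    err = abs(x - L)
    print(f"k={k}  error={err:.3e}  ratio={err/prev_err:.4f}  target |G'(L)|={rate:.4f}")
    prev_err = err
```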