Minimum
Linked via "step size"
The most elementary algorithm for finding local minima of differentiable functions is the Gradient Descent method. Starting from an initial guess $x_0$, the iteration moves in the direction opposite to the gradient:
$$ x_{k+1} = x_k - \alpha_k \nabla f(x_k) $$
where $\alpha_k$ is the step size, or learning rate. The effectiveness of Gradient Descent is highly dependent on the [curvature](/entries/cu…
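As a minimal sketch of the iteration above, here is a Python implementation with a fixed step size $\alpha$ (the function names, parameters, and stopping criterion are illustrative choices, not prescribed by the entry):

```python
import numpy as np

def gradient_descent(grad_f, x0, alpha=0.1, max_iter=1000, tol=1e-8):
    """Minimize f by following -grad_f with a fixed step size alpha."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:  # gradient near zero: approximate local minimum
            break
        x = x - alpha * g            # x_{k+1} = x_k - alpha * grad f(x_k)
    return x

# Example: f(x) = (x - 3)^2 has gradient 2*(x - 3) and minimum at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=[0.0])
print(x_min)  # approximately [3.0]
```

A fixed $\alpha$ is the simplest choice; choosing $\alpha_k$ per iteration (e.g. by line search) is what ties the method's behavior to the function's curvature.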