Newton's Method

Newton's Method is a second-order optimization algorithm that uses both gradient and Hessian information. The update rule is xₖ₊₁ = xₖ - [∇²f(xₖ)]⁻¹∇f(xₖ), where ∇²f is the Hessian matrix of second derivatives; in practice the step is computed by solving the linear system ∇²f(xₖ)p = ∇f(xₖ) rather than inverting the Hessian. Newton's Method converges quadratically when started sufficiently close to an optimum with a nonsingular Hessian, but forming and factoring the Hessian is expensive for high-dimensional problems. Quasi-Newton methods (BFGS, L-BFGS) build an approximation to the Hessian from successive gradients, reducing computational cost while maintaining superlinear convergence. Applications include solving nonlinear equations, unconstrained optimization, and interior point methods for constrained optimization.
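A minimal sketch of the update in Python with NumPy, assuming a twice-differentiable objective whose gradient and Hessian are supplied by the caller; the grad and hess functions below (for the Rosenbrock test function) are illustrative, not part of any library:

    import numpy as np

    def newton_method(grad, hess, x0, tol=1e-8, max_iter=50):
        # Iterate x_{k+1} = x_k - [Hess f(x_k)]^{-1} grad f(x_k).
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:  # stop once the gradient is near zero
                break
            # Solve the linear system H p = g instead of forming H^{-1}.
            p = np.linalg.solve(hess(x), g)
            x = x - p
        return x

    # Illustrative test problem: minimize the Rosenbrock function
    # f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, whose minimum is at (1, 1).
    def grad(v):
        x, y = v
        return np.array([-2 * (1 - x) - 400 * x * (y - x**2),
                         200 * (y - x**2)])

    def hess(v):
        x, y = v
        return np.array([[2 - 400 * (y - 3 * x**2), -400 * x],
                         [-400 * x, 200]])

    print(newton_method(grad, hess, x0=[-1.0, 1.0]))  # converges to [1, 1]

Solving H p = g with a factorization costs the same order of work as one inversion but is numerically more stable, which is why implementations avoid the explicit inverse that appears in the textbook formula.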
