Frank-Wolfe Algorithm

The Frank-Wolfe algorithm (also called the conditional gradient method) minimizes a differentiable convex function over a compact convex set. Each iteration: (1) linearizes the objective at the current point; (2) solves a linear program over the feasible set to find the minimizer of the linearization; (3) chooses a step size in [0, 1], by line search or a diminishing schedule such as 2/(k+2); (4) moves toward the linear minimizer by a convex combination, so feasibility is preserved without projection. Key advantages: iterates are sparse convex combinations of extreme points, no projection step is needed, and the linear subproblem often exploits the structure of the constraints. The convergence rate in objective value is O(1/k); away-step and pairwise variants achieve linear rates on polytopes for strongly convex objectives. Applications include traffic assignment, machine learning with structured sparsity (e.g. over the l1 or nuclear-norm ball), and large-scale convex optimization where computing projections is expensive.
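The four steps above can be sketched for one concrete feasible set. Over the probability simplex, the linear subproblem has a closed-form solution (the vertex with the smallest gradient coordinate), so no LP solver is needed. This is a minimal illustration, not a general-purpose implementation; the quadratic objective and the vector `b` are hypothetical example data.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=5000):
    """Frank-Wolfe over the probability simplex {x >= 0, sum(x) = 1}.

    The LP step argmin_{s in simplex} <grad(x), s> is solved in closed
    form: the minimizer is the vertex e_i with the smallest gradient
    coordinate.
    """
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        i = np.argmin(g)                   # (1)-(2) linearize and solve the LP
        s = np.zeros_like(x)
        s[i] = 1.0                         # best vertex of the simplex
        gamma = 2.0 / (k + 2.0)            # (3) standard diminishing step size
        x = (1 - gamma) * x + gamma * s    # (4) convex update; stays feasible
    return x

# Example (hypothetical data): minimize ||x - b||^2 over the simplex.
b = np.array([0.2, 0.5, 0.3])             # b is feasible, so the optimum is b
grad = lambda x: 2.0 * (x - b)
x0 = np.array([1.0, 0.0, 0.0])            # start at a vertex
x = frank_wolfe_simplex(grad, x0)
```

Note that the update never leaves the simplex, and each iterate is a convex combination of at most k + 1 vertices, which is the sparsity property mentioned above.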

» OR glossary