**F**
**Facility Location Problem**
Facility location problems determine optimal locations for facilities (warehouses, plants, service centers) to minimize total cost while satisfying demand. In the uncapacitated facility location problem (UFLP), given n customers and m potential facility sites with fixed opening costs fⱼ and service costs cᵢⱼ, the task is to decide which facilities to open and how to assign customers to them. The objective is to minimize Σⱼ fⱼyⱼ + Σᵢ Σⱼ cᵢⱼxᵢⱼ, where yⱼ ∈ {0,1} indicates whether facility j is opened and xᵢⱼ ∈ {0,1} indicates whether customer i is served by facility j. Variants include the capacitated facility location problem, the p-median problem, the p-center problem, and hub location. Solution methods include Lagrangian relaxation, branch and bound, and various heuristics. Applications arise in supply chain design, emergency services, and telecommunications.
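As a concrete illustration of the UFLP objective above, here is a minimal exhaustive-search sketch on a made-up instance (3 candidate facilities, 4 customers). Enumerating all subsets is only viable for tiny m; real instances need the branch-and-bound or Lagrangian methods mentioned above.

```python
# Brute-force solver for a tiny uncapacitated facility location problem.
# Data below is an invented example, not from any real instance.
from itertools import combinations

fixed = [4.0, 3.0, 4.0]   # f_j: cost of opening facility j
serve = [                  # c_ij: cost of serving customer i from facility j
    [2.0, 3.0, 6.0],
    [1.0, 4.0, 5.0],
    [6.0, 2.0, 1.0],
    [5.0, 3.0, 2.0],
]

def uflp_cost(open_set):
    """Fixed costs of open facilities plus each customer's cheapest open option."""
    return (sum(fixed[j] for j in open_set)
            + sum(min(row[j] for j in open_set) for row in serve))

def solve_uflp():
    m = len(fixed)
    best = None
    for k in range(1, m + 1):
        for subset in combinations(range(m), k):
            cost = uflp_cost(subset)
            if best is None or cost < best[0]:
                best = (cost, subset)
    return best

cost, opened = solve_uflp()
```

Here the optimal assignment x is implicit: once y (the open set) is fixed and there are no capacities, each customer is simply assigned to its cheapest open facility.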
**Feasible Region (Feasible Set)**
The feasible region is the set of all points satisfying every constraint of an optimization problem. For linear programs it forms a convex polyhedron (possibly unbounded or empty). Properties: (1) it is convex for LPs and convex programs; (2) for an LP with a finite optimum, an optimal solution occurs at an extreme point (vertex); (3) an empty feasible region means the problem is infeasible; (4) an unbounded region along which the objective improves indefinitely means the problem is unbounded. Understanding the geometry of the feasible region aids solution interpretation and algorithm design. In integer programming, the feasible region consists of discrete (lattice) points inside the LP relaxation's polyhedron, and the relaxation's optimal vertex is typically not one of them. Visualizing feasible regions (for two or three variables) builds intuition about problem structure and solution behavior.
**Frank-Wolfe Algorithm**
The Frank-Wolfe algorithm (also called the conditional gradient method) solves convex optimization problems over compact convex sets. At each iteration it: (1) linearizes the objective function at the current point; (2) minimizes that linear approximation over the feasible set (a linear program when the set is polyhedral) to find the best extreme point; (3) performs a line search toward that point to determine the step size; (4) updates the solution. Key advantages: it generates sparse solutions (iterates are convex combinations of few extreme points), avoids projection steps, and exploits the structure of the constraints. The convergence rate is O(1/k) in general, though variants such as away-step Frank-Wolfe improve it. Applications include traffic assignment, machine learning with structured sparsity, and large-scale convex optimization where computing projections is expensive.
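The four steps above can be sketched on the probability simplex, where the linear minimization in step (2) needs no LP solver: it just selects the vertex eⱼ whose gradient coordinate is smallest. The objective and target point below are made-up for illustration.

```python
# Minimal Frank-Wolfe sketch: minimize f(x) = ||x - t||^2 over the
# probability simplex {x >= 0, sum(x) = 1}. Target t is invented data.

t = (0.2, 0.3, 0.5)  # happens to lie inside the simplex

def grad(x):
    # gradient of ||x - t||^2
    return [2 * (xi - ti) for xi, ti in zip(x, t)]

def frank_wolfe(iters=200):
    n = len(t)
    x = [1.0] + [0.0] * (n - 1)        # start at a simplex vertex
    for _ in range(iters):
        g = grad(x)                     # (1) linearize at current point
        j = min(range(n), key=lambda i: g[i])  # (2) best vertex e_j
        d = [-xi for xi in x]
        d[j] += 1.0                     # direction d = e_j - x
        dd = sum(di * di for di in d)
        if dd == 0.0:
            break
        # (3) exact line search for this quadratic, clipped to [0, 1]:
        # gamma* = -g.d / (2 ||d||^2)
        gamma = max(0.0, min(1.0, -sum(gi * di for gi, di in zip(g, d)) / (2 * dd)))
        x = [xi + gamma * di for xi, di in zip(x, d)]  # (4) update
    return x

x = frank_wolfe()
```

Because each update is a convex combination of the current iterate and a vertex, x stays feasible at every step without any projection, which is exactly the advantage noted above.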