Semidefinite programming optimizes linear functions over positive semidefinite matrices, extending linear programming from vector variables to matrix variables. It combines broad modeling flexibility with computational tractability: SDPs are convex problems that interior-point methods solve efficiently in both theory and practice.
This section covers the basics of semidefinite programming, including problem formulation, geometric interpretation, and solution methods. We'll explore how to convert various optimization problems into SDP form and use interior-point methods to solve them efficiently.
Semidefinite Programming Basics
Definition and Key Components
- Semidefinite programming (SDP) optimizes a linear objective function over the intersection of the cone of positive semidefinite matrices with an affine space
- Standard form minimizes a linear function of a symmetric matrix variable subject to affine equality constraints and a positive semidefiniteness constraint; an equivalent inequality form uses a single linear matrix inequality (LMI) constraint (both written out after this list)
- Core components: linear objective function, symmetric matrix (or vector) decision variable, and affine or LMI constraints
- A symmetric matrix M is positive semidefinite when all its eigenvalues are non-negative (equivalently, xᵀMx ≥ 0 for every vector x)
- Feasible region consists of all positive semidefinite matrices satisfying given LMI constraints
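For reference, the two standard forms can be written as follows; the notation is one common convention, with C, A_i, b, c, and F_i denoting problem data:

```latex
% Equality (primal) standard form: symmetric matrix variable X
\min_{X \in \mathbb{S}^n} \; \langle C, X \rangle
\quad \text{s.t.} \quad \langle A_i, X \rangle = b_i, \; i = 1, \dots, m, \qquad X \succeq 0

% Inequality (LMI) standard form: vector variable x
\min_{x \in \mathbb{R}^m} \; c^{\top} x
\quad \text{s.t.} \quad F_0 + x_1 F_1 + \cdots + x_m F_m \succeq 0
```

Here ⟨C, X⟩ = tr(CX) is the trace inner product on symmetric matrices, and M ⪰ 0 means M is positive semidefinite.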
Theoretical Foundations
- Duality theory extends primal and dual concepts from linear programming (primal-dual pair written out after this list)
- Linear programming and second-order cone programming arise as special cases
- Positive semidefinite cone is a closed convex cone in the space of symmetric matrices
- Boundary of feasible region characterized by singular positive semidefinite matrices
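As a concrete reading of the duality bullet above, the primal-dual pair for the equality standard form can be written as follows (a sketch in the notation introduced earlier):

```latex
\text{(P)} \quad \min_{X \succeq 0} \; \langle C, X \rangle
\quad \text{s.t.} \quad \langle A_i, X \rangle = b_i, \; i = 1, \dots, m

\text{(D)} \quad \max_{y \in \mathbb{R}^m} \; b^{\top} y
\quad \text{s.t.} \quad C - \sum_{i=1}^{m} y_i A_i \succeq 0
```

Weak duality (the primal objective is at least the dual objective for any feasible pair) always holds; strong duality additionally requires a constraint qualification such as strict feasibility, revisited under facial reduction below.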
Problem Formulation Techniques
- Identify objective function, decision variables, and constraints in matrix form
- Express constraints using linear matrix inequalities (LMIs)
- Apply the Schur complement lemma to reformulate certain nonlinear (e.g., quadratic or fractional) constraints as LMIs (lemma stated after this list)
- Use S-procedure to reformulate quadratic constraints as semidefinite constraints
- Formulate matrix nearness and completion problems (e.g., the nearest correlation matrix) as SDPs (see the sketch after this list)
- Relax non-convex quadratic programs and combinatorial optimization problems into SDP form
- Implement lift-and-project method to strengthen integer programming relaxations
- Exploit problem-specific structure (sparsity, symmetry) to enhance solver efficiency
- Address numerical issues through careful problem reformulation
- Apply post-processing techniques (rounding, projection) to obtain high-quality feasible solutions
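The Schur complement step mentioned in this list rests on the following standard lemma; the block partition notation is ours:

```latex
M = \begin{bmatrix} P & Q \\ Q^{\top} & R \end{bmatrix}, \quad P \succ 0
\quad \Longrightarrow \quad
\bigl( M \succeq 0 \iff R - Q^{\top} P^{-1} Q \succeq 0 \bigr)
```

For example, a constraint of the form xᵀP⁻¹x ≤ t with P ≻ 0 is equivalent to the LMI

```latex
\begin{bmatrix} P & x \\ x^{\top} & t \end{bmatrix} \succeq 0,
```

which is affine, and hence LMI-representable, in P, x, and t jointly.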
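As an illustration of the nearest-correlation-matrix formulation, here is a minimal sketch using CVXPY (the Python incarnation of CVX). The function name, the random test data, and reliance on CVXPY's default SDP-capable solver are illustrative choices, not part of the source material:

```python
import numpy as np
import cvxpy as cp

def nearest_correlation_matrix(A):
    """Project a symmetric matrix A onto the correlation matrices
    (positive semidefinite with unit diagonal) in the Frobenius norm."""
    n = A.shape[0]
    X = cp.Variable((n, n), symmetric=True)          # symmetric matrix variable
    objective = cp.Minimize(cp.norm(X - A, "fro"))   # distance to the data matrix
    constraints = [X >> 0,                           # positive semidefiniteness
                   cp.diag(X) == 1]                  # unit diagonal
    cp.Problem(objective, constraints).solve()       # assumes an SDP-capable solver is installed
    return X.value

# Example: perturb the identity and project the result back onto the correlation matrices
rng = np.random.default_rng(0)
A = np.eye(4) + 0.3 * rng.standard_normal((4, 4))
A = (A + A.T) / 2                                    # symmetrize the perturbed input
print(nearest_correlation_matrix(A).round(3))
```

The same pattern extends to completion problems: fix the observed entries with equality constraints and leave the remaining entries free.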
Geometric Interpretation of Constraints
Geometric Properties
- Positive semidefinite matrices form a convex cone in symmetric matrix space
- LMI constraints define spectrahedra (intersection of positive semidefinite cone with affine subspace)
- Boundary points correspond to singular positive semidefinite matrices where rank drops
- Elliptope (positive semidefinite matrices with unit diagonal) is exactly the set of correlation matrices (definitions written out after this list)
- Facial structure of positive semidefinite cone provides insights into constraint geometry and optimal solutions
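Written out explicitly, a spectrahedron is the solution set of a single LMI, and the elliptope is the spectrahedron cut out by unit-diagonal constraints (standard definitions, stated here for reference):

```latex
S = \{\, x \in \mathbb{R}^m : A_0 + x_1 A_1 + \cdots + x_m A_m \succeq 0 \,\}
\qquad
\mathcal{E}_n = \{\, X \in \mathbb{S}^n : X \succeq 0,\ X_{ii} = 1 \text{ for } i = 1, \dots, n \,\}
```

The elliptope is also the feasible set of the SDP relaxation of Max-Cut, which is one reason it receives special attention.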
Duality and Facial Reduction
- Duality interpreted geometrically as separating hyperplanes between positive semidefinite cone and constraint affine subspace
- Facial reduction regularizes ill-posed SDPs whose feasible set contains no strictly feasible (positive definite) point, i.e., Slater's condition fails (condition stated after this list)
- Analyze facial structure to understand nature of optimal solutions
- Identify redundant constraints and simplify problem structure
- Improve numerical stability of SDP solvers through geometric insights
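For the duality and facial-reduction bullets above, the usual constraint qualification is strict feasibility (Slater's condition); for the equality standard form it reads:

```latex
\exists\, \bar{X} \succ 0 \ \text{with} \ \langle A_i, \bar{X} \rangle = b_i,\ i = 1, \dots, m
\quad \Longrightarrow \quad
p^{\star} = d^{\star} \ \text{and the dual optimum is attained (provided } p^{\star} > -\infty\text{)}
```

When no strictly feasible point exists, facial reduction restricts the problem to the smallest face of the positive semidefinite cone containing the feasible set, which restores strict feasibility on that face.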
Solving Semidefinite Programs
Interior-Point Methods
- Primary algorithmic approach for solving SDPs
- Exploit the self-concordant log-determinant barrier of the positive semidefinite cone (written out after this list)
- Primal-dual methods simultaneously solve primal and dual SDP problems
- Offer efficient and robust performance for most practical instances
- Implement various preprocessing techniques to improve solver performance
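The barrier referred to above is the log-determinant function; schematically, interior-point methods follow the central path traced out by the barrier subproblems as the parameter μ is driven to zero:

```latex
f(X) = -\log\det X, \quad X \succ 0
\qquad\qquad
\min_{X \succ 0} \ \langle C, X \rangle - \mu \log\det X
\quad \text{s.t.} \quad \langle A_i, X \rangle = b_i, \ i = 1, \dots, m
```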
Solvers and Modeling Tools
- Dedicated SDP solver packages (SDPT3, SeDuMi, MOSEK) implement interior-point methods
- Modeling languages (CVX, YALMIP, JuMP) provide high-level interfaces
- Abstract low-level solver details for easier problem formulation
- Support various problem types and solver options
- Enable rapid prototyping and experimentation with different formulations, as in the sketch below
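To tie the modeling-language bullets back to the standard form shown earlier, here is a minimal CVXPY sketch of an equality-form SDP. The random data, the construction of b so that the identity is strictly feasible, the positive semidefinite choice of C (to keep the objective bounded below), and the SCS solver selection are all illustrative assumptions:

```python
import numpy as np
import cvxpy as cp

n, m = 4, 3
rng = np.random.default_rng(1)

# Symmetric problem data: PSD objective matrix C and constraint matrices A_1..A_m
M = rng.standard_normal((n, n))
C = M @ M.T                                        # PSD C keeps tr(C X) bounded below over X >> 0
A = [rng.standard_normal((n, n)) for _ in range(m)]
A = [(Ai + Ai.T) / 2 for Ai in A]                  # symmetrize constraint matrices
b = np.array([np.trace(Ai) for Ai in A])           # equals tr(A_i I), so X = I is strictly feasible

# Equality standard form: minimize tr(C X) s.t. tr(A_i X) = b_i, X positive semidefinite
X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0] + [cp.trace(A[i] @ X) == b[i] for i in range(m)]
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
prob.solve(solver=cp.SCS)                          # any installed SDP-capable solver would do

print("status:", prob.status)
print("optimal value:", prob.value)
```

Swapping the backend (for example MOSEK, if licensed and installed) only changes the `solver=` argument, which is exactly the abstraction the bullets above describe.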