Conic Optimization

Conic optimization problems are a class of convex nonlinear optimization problems, lying between linear programming (LP) problems and general convex nonlinear problems.  Among others, convex quadratic programming (QP) and quadratically constrained programming (QCP) problems -- including most portfolio construction and risk budgeting problems -- can be formulated as conic optimization problems.

A conic optimization problem can be written as an LP -- with a linear objective and linear constraints -- plus one or more cone constraints.  A cone constraint requires the vector formed by a set of decision variables to lie within a closed convex pointed cone.  The simplest example of such a cone is the non-negative orthant, the region where all variables are non-negative -- the usual requirement in an LP.  But conic optimization allows for more general cones.
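
In one common standard form (the notation below is a general convention, not taken from this article), such a problem can be written as:

    \[
    \begin{aligned}
    \text{minimize}\quad   & c^{\mathsf T} x \\
    \text{subject to}\quad & A x = b, \\
                           & x \in K = K_1 \times \cdots \times K_m ,
    \end{aligned}
    \]

where each K_i is a closed convex pointed cone, such as a non-negative orthant or a second order cone.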

Second Order Cone Programming (SOCP)

A simple type of closed convex pointed cone that captures many optimization problems of interest is the second order cone, also called the Lorentz cone or "ice cream cone."  Geometrically, in three dimensions it looks just like an ice cream cone: a circular cross-section, centered on one axis, that widens linearly as you move away from the apex at the origin.

A second order cone (SOC) constraint of dimension n specifies that the vector formed by a set of n decision variables must belong to this cone.  Algebraically, the L2-norm of the first n-1 variables must be less than or equal to the value of the nth variable -- which, since a norm is never negative, forces that variable to be non-negative.  A problem with a linear objective and linear plus SOC constraints is called a second order cone programming (SOCP) problem.
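
In symbols (standard notation, not specific to this article), the cone of dimension n and the corresponding constraint are:

    \[
    K_n = \bigl\{\, x \in \mathbb{R}^n : \lVert (x_1, \ldots, x_{n-1}) \rVert_2 \le x_n \,\bigr\} .
    \]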

In Excel, an SOC constraint of dimension 5 can be written as A1 >= SQRT(SUMSQ(A2:A5)), or if A1 is non-negative, as A1^2 >= SUMSQ(A2:A5).  A rotated second order cone or SRC constraint -- an alternative way of expressing membership in a second order cone that is convenient for modeling -- can be written as 2*A1*A2 >= SUMSQ(A3:A5), with A1 and A2 also required to be non-negative.
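
Outside Excel, the same two constraints can be sketched in Python with the open-source cvxpy modeling library (an assumption of this example -- cvxpy is not part of the Premium Solver Platform), using a vector variable x in place of cells A1:A5:

    import cvxpy as cp
    import numpy as np

    x = cp.Variable(5)                       # plays the role of cells A1:A5

    # SOC constraint of dimension 5:  x[0] >= SQRT(x[1]^2 + ... + x[4]^2)
    soc = cp.SOC(x[0], x[1:5])

    # Rotated cone:  2*x[0]*x[1] >= x[2]^2 + x[3]^2 + x[4]^2  with x[0], x[1] >= 0,
    # stated in disciplined convex form as (sum of squares) / x[1] <= 2*x[0];
    # it is shown here for the formula only and is not used in the problem below
    rotated = cp.quad_over_lin(x[2:5], x[1]) <= 2 * x[0]

    # Tiny check: with x[1:5] fixed, the smallest feasible x[0] under the SOC
    # constraint is the Euclidean norm of the fixed vector, here 3.0
    prob = cp.Problem(cp.Minimize(x[0]),
                      [soc, x[1:5] == np.array([1.0, 2.0, 2.0, 0.0])])
    prob.solve()
    print(round(float(x.value[0]), 4))       # about 3.0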

An example of a problem that arises frequently in engineering and is naturally modeled with an SOC constraint is the least squares problem:  Find the vector x that minimizes the L2-norm of Ax - b (where A is a matrix and x and b are vectors).
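
Minimizing that norm can be recast as minimizing a new variable t subject to the SOC constraint that the L2-norm of Ax - b is at most t.  A minimal cvxpy sketch with made-up data (an assumption of this example) reproduces the classical least squares answer:

    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 3))
    b = rng.standard_normal(20)

    # minimize t  subject to  ||A x - b||_2 <= t   (a single SOC constraint)
    x = cp.Variable(3)
    t = cp.Variable()
    prob = cp.Problem(cp.Minimize(t), [cp.SOC(t, A @ x - b)])
    prob.solve()

    # The SOCP solution matches the classical least squares solution
    x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.allclose(x.value, x_ls, atol=1e-4))     # True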

Quadratic Constraints and SOCP

A convex quadratic constraint can be reformulated as an SOC constraint.  This requires several steps of linear algebra -- a nontrivial effort for the user.  But the SOCP Solvers in the Premium Solver Platform will automatically recognize quadratic constraints and convert them to SOC constraints when the problem is solved.  Hence, using an SOCP Solver to solve a quadratically constrained (QCP) problem is simply a matter of writing the quadratic constraints as Excel formulas, selecting one of the SOCP Solvers from the Solver Engine dropdown list, and clicking the Solve button.
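
The reduction itself is a standard one (the derivation below is general textbook material, not specific to this article): if the constraint is xᵀQx <= aᵀx + b with Q positive semidefinite, factor Q = FᵀF, for example by a Cholesky factorization; then

    \[
    x^{\mathsf T} Q x \le a^{\mathsf T} x + b
    \;\Longleftrightarrow\;
    \lVert F x \rVert_2^2 \le a^{\mathsf T} x + b ,
    \]

and the right-hand form is exactly a rotated second order cone constraint 2uv >= ||w||^2 with u = (aᵀx + b)/2, v = 1, and w = Fx.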

LP and QP Problems as SOCPs

As noted above, every linear programming problem is a conic optimization problem, and quadratic constraints can be converted automatically to SOC constraints.  A quadratic objective xᵀQx can be handled by introducing a new variable t, making the objective "minimize t", adding the constraint xᵀQx <= t, and converting this constraint to SOC form; this is also done automatically by the SOCP Solvers in the Premium Solver Platform.  So you can simply select an SOCP Solver, instead of a traditional LP or QP Solver, from the Solver Engine dropdown list and click Solve.
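
Here is a sketch of the same epigraph idea in cvxpy (an assumption of this example, with made-up data): a small convex QP is solved once through the reformulation and once directly, with matching results.

    import cvxpy as cp
    import numpy as np

    # Small convex QP:  minimize x'Qx + c'x  subject to  sum(x) = 1, x >= 0
    Q = np.array([[2.0, 0.5],
                  [0.5, 1.0]])               # positive definite
    c = np.array([-1.0, -0.5])
    F = np.linalg.cholesky(Q).T              # Q = F'F

    # Epigraph form: minimize t + c'x with ||Fx||^2 <= t, which cvxpy
    # canonicalizes into a (rotated) second order cone constraint
    x = cp.Variable(2)
    t = cp.Variable()
    epi = cp.Problem(cp.Minimize(t + c @ x),
                     [cp.sum_squares(F @ x) <= t, cp.sum(x) == 1, x >= 0])
    epi.solve()

    # Direct QP statement for comparison
    y = cp.Variable(2)
    qp = cp.Problem(cp.Minimize(cp.quad_form(y, Q) + c @ y),
                    [cp.sum(y) == 1, y >= 0])
    qp.solve()

    print(np.allclose(x.value, y.value, atol=1e-4))  # True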

Solving SOCP Problems

Second order cone programming (SOCP) problems can be efficiently solved via specialized Interior Point methods.  (Enhancement of the Simplex method to solve SOCP problems is an area of active research, but there are not yet commercial quality implementations of the proposed methods.)
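
For illustration (again assuming cvxpy, version 1.3 or later), an interior point conic solver can be requested explicitly; Clarabel ships with recent cvxpy releases, and ECOS is another interior point option:

    import cvxpy as cp
    import numpy as np

    x = cp.Variable(3)
    target = np.array([1.0, 2.0, 3.0])
    prob = cp.Problem(cp.Minimize(cp.norm(x - target, 2)), [cp.sum(x) == 1])
    prob.solve(solver=cp.CLARABEL)           # an interior point conic solver
    print(prob.status, round(prob.value, 4))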

But the Premium Solver Platform's Polymorphic Spreadsheet Interpreter enables all of the nonlinear Solver Engines to solve problems with SOC constraints.  Again, you simply select a Solver -- such as the standard or Large-Scale GRG Solver, the Large-Scale SQP Solver, or the KNITRO Solver -- from the Solver Engine dropdown list, and click Solve.

Since an SOC constraint, written in its quadratic form (such as A1^2 >= SUMSQ(A2:A5) above), is a second order function, its Hessian (the matrix of its second partial derivatives with respect to the decision variables) is constant.  The Premium Solver Platform's Interpreter can efficiently compute the Hessian of problem functions via automatic differentiation.  Once the Hessians are obtained, an Interior Point method can solve an SOCP problem in roughly the same time as an LP problem of equivalent size.  And since SOCP problems are always convex, we can be assured of finding the globally optimal solution.
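
Concretely (a standard calculation, not taken from the article), writing the dimension-n constraint in quadratic form g(x) >= 0 gives

    \[
    g(x) = x_n^2 - \bigl(x_1^2 + \cdots + x_{n-1}^2\bigr),
    \qquad
    \nabla^2 g(x) = \operatorname{diag}(-2, \ldots, -2, \; 2),
    \]

a constant diagonal matrix that needs to be evaluated only once.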

This gives us the "best of both worlds" -- increased modeling power and flexibility, combined with the speed, reliability, and scalability of linear programming.  For these reasons, some observers believe that second order cone programming will replace and subsume linear programming as the most commonly used optimization method.