Contact: Dan Fylstra
- Basic Ideas and Applications of Optimization
- Linear Programming and Nonlinear Optimization
- Software to Test Problems for Convexity
- Conic Optimization
- Software for Convex and Conic Optimization
'Convex optimization' and 'conic optimization' are two new and fairly esoteric terms in the already-abstruse domain of mathematical optimization. Even some experts in operations research and management science may not yet be conversant with these terms. But they have many practical implications for leading-edge applications in finance, investment and engineering.
In fact, some observers believe that, in a few years, convex and conic optimization will 'replace' (actually subsume) linear programming as the most widely known and used form of optimization. With the introduction of easy-to-use convex and conic optimization in Frontline Systems' upgrade to the Solver in Microsoft Excel - by far the most widely used optimization software - this development may happen sooner rather than later.
"...in fact, the great watershed in optimization isn't between linearity and nonlinearity, but convexity and nonconvexity."
- R. Tyrrell Rockafellar, in SIAM Review, 1993
Optimization is used to make decisions about the best way to allocate scarce resources. The resources may be raw materials, machine time or people time, money, or anything else in limited supply. The best or optimal solution may mean maximizing profits, minimizing costs, or achieving the best quality.
The specific amounts of various resources to be used in various ways are decision variables, and the best values for these variables are found by optimization or Solver software. 'Best' is measured by an objective function that depends on the variables. Constraint functions also depend on the variables, and measure the scarcity of resources, or limitations on the ways they can be used.
If we are allocating funds to an investment portfolio, the variables may be the amounts to invest in each stock, a constraint may be a minimum desired rate of return, and the objective may be to minimize risk of loss. If we're allocating water to the hydroelectric turbines in a dam, the variables may be water flows, a constraint may be pipeline capacity, and the objective may be to maximize the electric power generated. If we're allocating airline crews to flights, the variables may be hours spent by crew members, constraints arise from work rules and the need to return 'home,' and the objective may be to offer the most flights at times when passengers want to fly.
The best known and most widely used form of optimization is linear programming (also called linear optimization), where the objective and all of the constraints are linear functions of the variables. (A linear function can be graphed as a straight line.) Although this form is restrictive, it is very widely used, especially in business applications, because (i) many practical problems can be formulated with linear objectives (say for profits) and linear constraints (for budgets, capacities, etc.), and (ii) many practical problems - especially for larger companies - involve thousands to hundreds of thousands of variables and constraints, and linear optimization problems of this size can be solved quickly and reliably with modern software.
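To make this concrete, here is a minimal sketch of such a linear program using SciPy's linprog routine. The production numbers are invented for illustration and do not come from the article: maximize profit 3x + 5y for two products subject to capacity limits.

```python
from scipy.optimize import linprog

# Illustrative example: maximize profit 3x + 5y subject to
#   x <= 4, 2y <= 12, 3x + 2y <= 18, x >= 0, y >= 0.
# linprog minimizes, so we negate the objective coefficients.
res = linprog(c=[-3, -5],
              A_ub=[[1, 0], [0, 2], [3, 2]],
              b_ub=[4, 12, 18],
              bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimum at x = 2, y = 6 with profit 36
```

Because the objective and constraints are all linear, the solver finds the single globally optimal corner of the feasible region in a fraction of a second, and the same behavior scales to problems with many thousands of variables.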
In contrast, nonlinear optimization problems - where the objective and constraints need not be linear functions of the variables - are much more difficult to solve. Even with modern software and high-speed computers, solving such problems usually involves compromises: The 'optimal' solution may only be 'locally optimal,' the time required to find the truly best or 'globally optimal' solution may be hours or days instead of seconds, and the size of the problems that can be solved may be limited to hundreds of variables for globally optimal solutions. Nonlinear problems are common in engineering, from circuit design, to DNA sequence matching, to bridge and building design, to 'financial engineering' of securities, mortgages, and derivatives.
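The local-versus-global distinction can be seen on a toy one-dimensional non-convex function (invented for illustration): a local search confined near each 'valley' of the function returns a different answer, and only one of those answers is globally optimal.

```python
from scipy.optimize import minimize_scalar

# A toy non-convex function with two local minima of different depth.
f = lambda x: (x**2 - 1)**2 + 0.2 * x

# A local search restricted to each basin finds a different "optimum".
right = minimize_scalar(f, bounds=(0, 2), method='bounded')   # x ~ 0.97, f ~ 0.20
left = minimize_scalar(f, bounds=(-2, 0), method='bounded')   # x ~ -1.02, f ~ -0.20
print(right.x, right.fun)
print(left.x, left.fun)   # the left minimum is lower: it is the global optimum
```

A local nonlinear Solver started near x = 1 would happily report the higher minimum as 'optimal'; nothing in the solution itself reveals that a better point exists elsewhere.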
However, some nonlinear optimization problems - including a great many practical problems in finance and engineering - share the desirable properties of linear optimization, because their objectives and constraints are convex functions of the variables. The solution to a convex optimization problem, like a linear programming problem, is always globally optimal. And using modern 'interior point' optimization methods, convex problems can be solved quickly and reliably, up to hundreds of thousands of variables and constraints - meeting the needs of most large corporations. Examples of convex and non-convex regions, formed by the intersections of the constraints, are shown below:
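Formally, a function is convex if its value at the midpoint of any two points is at most the average of its values at those two points. A quick numerical spot-check of that inequality can be sketched as follows (an illustrative sampling test, not the method used by any particular Solver): passing the check merely suggests convexity, while a single failure proves non-convexity.

```python
import numpy as np

def appears_convex(f, lo, hi, trials=1000, seed=0):
    """Spot-check the defining inequality f((x+y)/2) <= (f(x)+f(y))/2
    on random pairs drawn from [lo, hi]. Passing only suggests
    convexity; one violation proves non-convexity."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, trials)
    y = rng.uniform(lo, hi, trials)
    mid = f((x + y) / 2)
    return bool(np.all(mid <= (f(x) + f(y)) / 2 + 1e-12))

# x^2 is convex everywhere; sin is concave on [0, pi], so the check fails there.
print(appears_convex(lambda t: t**2, -5.0, 5.0))  # True
print(appears_convex(np.sin, 0.0, np.pi))         # False
```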
The problem in practice has been that, except for the simplest quadratic optimization problems, it has been very difficult for users to determine whether their nonlinear problems are convex or non-convex. It takes considerable mathematical skill and experience to recognize that a problem is convex. Programming a computer to automatically test a problem for convexity is mathematically challenging, and no software has been available to analyze a problem and tell the user whether it is convex or non-convex.
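Even the 'simplest' quadratic case already requires some linear algebra: a quadratic objective x'Qx is convex exactly when the symmetric part of Q is positive semidefinite. A sketch of that eigenvalue test (illustrative only, and not Frontline's convexity tester):

```python
import numpy as np

def quadratic_is_convex(Q, tol=1e-10):
    """A quadratic f(x) = x'Qx + c'x is convex exactly when the
    symmetric part of Q is positive semidefinite, i.e. all of its
    eigenvalues are >= 0 (up to a small numerical tolerance)."""
    S = (Q + Q.T) / 2
    return bool(np.all(np.linalg.eigvalsh(S) >= -tol))

print(quadratic_is_convex(np.array([[2.0, 0.0], [0.0, 1.0]])))   # True
print(quadratic_is_convex(np.array([[1.0, 0.0], [0.0, -1.0]])))  # False: a saddle
```

For general nonlinear functions there is no such simple test, which is why automatic convexity testing remained an open problem for so long.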
Because of this difficulty, for decades, users have stuck with linear programming, or else they have solved their nonlinear problems with general purpose - but much slower and less reliable - Solvers for non-convex nonlinear optimization, settling in most cases for only a locally optimal solution. The "great watershed" has seemed to lie between linear programming on the one hand, and nonlinear optimization on the other.
In 2004, the first research - including original work by Frontline Systems developers - was presented at academic conferences, describing algorithms and software that could test problems automatically and tell the user whether they were convex or non-convex. Such testing is very challenging computationally - it has been proven that no method can always determine convexity in less than exponential (read: prohibitively long) time. But the new research focuses on methods that can determine convexity in a short time for wide classes of practical problems. This research is so new that the first technical papers in refereed journals won't appear until 2005.
Frontline's Premium Solver Platform V6.0 is the first commercial product to include methods to automatically test an optimization problem and tell the user whether it is convex or non-convex. Now, any user of Microsoft Excel can create a Solver model in spreadsheet form, using ordinary Excel formulas and functions, then simply press a button to check whether the model is convex.
Making it easy to test a model for convexity opens up the 'world of convex optimization' to a broad range of users, beyond the few who have the mathematical skills to recognize or formulate a convex problem on their own. Users with existing nonlinear optimization models, who've had no way to be sure whether their solutions are locally optimal or globally optimal, can test their models for convexity - and if the answer is positive, they will for the first time have the assurance that their solutions are globally optimal.
From 1994 to 2004, academic researchers developed an extensive theory and solution methods for so-called conic optimization. Conic optimization lies between linear optimization and nonlinear optimization - it permits specific kinds of nonlinear constraints, called cone constraints. A linear constraint qualifies as a cone constraint; the next step is the so-called second-order cone, also referred to as the Lorentz cone or 'ice cream cone' and depicted below:
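Concretely, the second-order cone in n+1 dimensions is the set of points (x, t) where the Euclidean norm of the vector x is at most the scalar t. A minimal membership check, with illustrative numbers:

```python
import numpy as np

def in_second_order_cone(x, t):
    """The second-order (Lorentz) cone is the set of points (x, t)
    with ||x||_2 <= t. Its cross-sections at each height t are disks,
    which is where the 'ice cream cone' picture comes from."""
    return bool(np.linalg.norm(x) <= t)

print(in_second_order_cone(np.array([3.0, 4.0]), 5.0))   # True: ||(3, 4)|| = 5
print(in_second_order_cone(np.array([3.0, 4.0]), 4.9))   # False: outside the cone
```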
The crucial mathematical fact about conic optimization problems is that - like linear optimization problems - they are always convex. This implies that they can be solved quickly and reliably up to large size - and indeed, this is borne out in practice.
The crucial application fact about second-order conic optimization problems (also called SOCP or second-order cone programming problems) is that many (perhaps most) interesting and important problems in finance and engineering are SOCP problems. A wide range of investment problems, such as construction of efficient portfolios, 'style analysis' of money managers, tests for 'conditional value at risk,' and new 'risk budgeting' used by leading pension funds, are convex quadratic problems, which are special cases of SOCP problems. And a wide range of new problems, using robust optimization methods that better account for 'noise' and uncertainty in stock and bond prices, can be formulated as SOCP problems.
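As one illustration of the simplest case mentioned above - a minimum-variance portfolio with a required rate of return - the following sketch uses SciPy's general nonlinear solver on invented data for three assets (a dedicated QP or SOCP solver would be the natural production choice):

```python
import numpy as np
from scipy.optimize import minimize

# Invented, illustrative data: expected returns and a covariance matrix
# for three assets. Minimize portfolio variance w'Sw subject to a minimum
# expected return and full investment (weights sum to 1, no shorting).
mu = np.array([0.08, 0.12, 0.10])
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])
r_min = 0.10

res = minimize(
    lambda w: w @ sigma @ w,          # portfolio variance (risk)
    x0=np.full(3, 1 / 3),             # start from equal weights
    bounds=[(0, 1)] * 3,              # no short selling
    constraints=[{"type": "eq",   "fun": lambda w: w.sum() - 1},
                 {"type": "ineq", "fun": lambda w: mu @ w - r_min}],
)
print(res.x.round(3), round(float(res.fun), 5))
```

Because the objective is a convex quadratic and the constraints are linear, the problem is convex, so the weights returned here are globally optimal - exactly the guarantee the article describes.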
In engineering, robust least squares problems that better account for 'noise' and uncertainty in signals, circuits and the like, antenna design problems for radio, radar and wireless applications, control problems in aerospace applications, and truss design problems for bridges and buildings can all be formulated as SOCP problems.
Commercial software products for solving SOCP problems have just begun to appear. Frontline's Premium Solver Platform V6.0 is among the very first such products, with its new, built-in SOCP Barrier Solver and its optional MOSEK Solver Engine. Like other Frontline Solvers - and unlike much other optimization software - these new 'Solver Engines' are very easy to use: The user simply selects the Solver by name from a dropdown list, and clicks the Solve button. To create a second-order cone constraint, the user simply selects the 'soc' choice from a dropdown list in the Platform's Add Constraint dialog.
The new SOCP Barrier Solver - bundled with the Premium Solver Platform V6.0 - finds fast, reliable solutions to linear programming (LP) problems, classic quadratic programming (QP) problems, quadratically constrained (QCP) problems, and new SOCP problems, with up to 2,000 decision variables. For larger problems, the user can easily install the MOSEK Solver Engine, which 'plugs in' to the Premium Solver Platform. This Solver has been tested on SOCP problems of over 100,000 decision variables - demonstrating in practice that SOCP problems can be solved quickly and reliably up to very large size, now in everyday Microsoft Excel models. Frontline also offers an extended version of the MOSEK Solver Engine that solves general convex nonlinear problems, beyond LP, QP, QCP and SOCP problems. Combined with the Premium Solver Platform V6.0's new ability to test Excel models for convexity, this Solver offers the realistic potential to solve convex optimization problems with the kind of speed and reliability that was formerly available only for linear programming problems.
And with automatic convexity testing, all of Frontline's Solvers for nonlinear optimization - from the GRG Solver included with Microsoft Excel and the Premium Solver Platform, to the Large-Scale GRG Solver, Large-Scale SQP Solver, and the exciting new KNITRO Solver Engine - will be able to solve convex optimization problems, with much better performance than for non-convex problems. Frontline has even extended these Solvers to handle second-order cone (SOC) constraints. So the user who creates an SOCP model in Excel will have several choices of Solvers available to optimize their model. Taken together, the Premium Solver Platform V6.0 and accompanying Solver Engines hold the promise of opening up the use of convex optimization to a much wider audience, by making it much easier than ever before to define and solve such problems.
For good reasons - both the underlying mathematics, and the application needs - many observers believe that convex and conic optimization will eventually 'replace' (actually subsume) linear programming as the most widely known and used form of optimization. Frontline Systems' goal is to accelerate that process, by making convex and conic optimization very easy to use.