Size, Sparsity and Integer Variables
The size of a solver model is measured by the number of decision variables and the number of constraints it contains.
Most optimization software has a practical upper limit on the size of model it can handle, due to memory requirements or numerical stability. Frontline's standard Solver engines have realistic upper limits on model size; our most powerful Solver engines have no fixed limits. You can view these limits in a dialog box in Excel, or obtain them by calling a function from Excel VBA, Visual Basic or C/C++.
For example, the LP/Quadratic Solver engine included with the Premium Solver Platform has an upper limit of 8,000 decision variables, and the GRG Nonlinear Solver engine has a limit of 500 decision variables. (Other Solver engines have much higher limits.)
Most large solver models are sparse. This means that, while there may be thousands of decision variables and thousands of constraints, the typical constraint depends on only a small subset of the variables. For example, a linear programming model with 10,000 variables and 10,000 constraints could have a "coefficient matrix" with 10,000 x 10,000 = 100 million elements, but in practice, the number of nonzero elements is likely to be closer to 2 million.
Frontline's Large-Scale Solver engines are designed to exploit sparsity. For example, a solver engine that created a coefficient matrix for all 100 million elements in the problem above would likely need 800 megabytes of RAM just to hold the matrix -- more than the memory available on most PCs -- and it would take a great deal of CPU time to process this huge matrix.
Instead, Frontline's Large-Scale LP Solver creates a sparse data structure for a factorized matrix that might require 24 megabytes for the above problem, and would proceed to solve it far more quickly, with greater numerical accuracy.
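The arithmetic behind these figures can be checked with a quick sketch. This assumes 8-byte double-precision coefficients and a simple (row, column, value) triplet scheme for the sparse case; Frontline's actual factorized data structure is different and more compact, as the 24-megabyte figure above indicates.

```python
# Back-of-the-envelope storage comparison for the 10,000 x 10,000 LP
# discussed above, assuming 8-byte double-precision coefficients.

n_rows = n_cols = 10_000
nonzeros = 2_000_000            # roughly 2% of entries, as in the text

dense_bytes = n_rows * n_cols * 8                 # every entry stored
# Assumed triplet scheme: two 4-byte integer indices plus one 8-byte value
sparse_bytes = nonzeros * (4 + 4 + 8)

print(dense_bytes // 2**20, "MB dense")    # ~762 MB (the text rounds to 800 MB)
print(sparse_bytes // 2**20, "MB sparse")  # ~30 MB
```

Even this naive sparse scheme cuts storage by a factor of about 25; exploiting structure during factorization reduces it further.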
Integer variables (decision variables that are constrained to take integer values at the solution) make a model far more difficult to solve. Memory use and solution time may rise exponentially as you add more integer variables. Even with highly sophisticated algorithms and modern supercomputers, there are models of just a few hundred integer variables that have never been solved to optimality.
This is because many combinations of specific integer values for the variables must be tested, and each test requires the solution of a "normal" linear or nonlinear optimization problem.
The number of combinations can rise exponentially with the size of the problem. "Branch and bound" and "branch and cut" strategies help to limit this exponential growth, but even with these strategies, moderately large mixed-integer programming (MIP) problems can take a great deal of time to solve.
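The branch-and-bound idea can be sketched on a tiny 0/1 knapsack problem, a simple integer program. This is an illustrative sketch with made-up item data, not Frontline's implementation: each "test" solves a relaxed subproblem (here, the fractional knapsack) to get an upper bound, and a whole subtree of integer combinations is pruned whenever its bound cannot beat the best integer solution found so far.

```python
# Minimal branch-and-bound sketch for a 0/1 knapsack (illustrative only).

def fractional_bound(values, weights, capacity, fixed):
    """Upper bound for a partial assignment: count the fixed choices, then
    fill the remaining capacity greedily, allowing a fraction of one item
    (the LP relaxation of the remaining subproblem)."""
    value, room = 0.0, capacity
    for i, x in enumerate(fixed):
        if x:
            value += values[i]
            room -= weights[i]
    if room < 0:
        return float("-inf")          # infeasible branch
    free = sorted(range(len(fixed), len(values)),
                  key=lambda i: values[i] / weights[i], reverse=True)
    for i in free:
        if weights[i] <= room:
            value += values[i]
            room -= weights[i]
        else:
            value += values[i] * room / weights[i]   # fractional fill
            break
    return value

def branch_and_bound(values, weights, capacity):
    n = len(values)
    best = 0.0
    stack = [()]                      # partial assignments of 0/1 choices
    while stack:
        fixed = stack.pop()
        bound = fractional_bound(values, weights, capacity, fixed)
        if bound <= best:
            continue                  # prune: subtree cannot beat incumbent
        if len(fixed) == n:
            best = bound              # all variables fixed; bound is exact
            continue
        stack.append(fixed + (1,))    # branch: include the next item...
        stack.append(fixed + (0,))    # ...or exclude it
    return best

# Hypothetical data: three items, capacity 50; optimum is items 2 and 3.
print(branch_and_bound([60, 100, 120], [10, 20, 30], 50))  # prints 220.0
```

Without the pruning step, the loop would examine all 2^n assignments; the bound is what keeps the search tractable, and the quality of that bound is one reason different formulations of the same model solve at radically different speeds.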
Different formulations of the same model with integer variables can take radically different amounts of time to solve. With such models, expert consulting help with the formulation may make the difference between an easily solvable and a practically unsolvable model. With a well-designed formulation, it is often possible to solve linear programming problems with hundreds or even thousands of integer variables in a modest amount of time.