### Linear Functions

So far in this course, the objective and constraints of our models have been *linear* functions of the variables. This is true of many models in practice, and it means that the function can be written as a sum of terms, where each term consists of one decision variable multiplied by a constant. Linear models are the *ideal* type of optimization model, since a globally optimal solution can generally be found very quickly when all model elements are linear.

A common example of a linear function is =SUM(C1:C5), where C1:C5 are decision variables. Linear functions can also be more complex. For example, if cells C1 and C2 are decision variables, B1 = C1+C2, and B2 = A1*B1 where A1 is a constant, then B2 is a linear function (=A1*C1 + A1*C2). Or suppose that the objective function is =B1/B2*C1 + (D1*2+E1)*C2, where only C1 and C2 are decision variables, and the other cells contain constants (or formulas that don’t depend on the variables). This is still a linear function, because the coefficients multiplying C1 and C2 are themselves constant.

Note that the SUMPRODUCT function computes exactly the algebraic expression shown above. If we were to place the formula =B1/B2 in cell A1, and the formula =(D1*2+E1) in cell A2, then we could write the example function above as =SUMPRODUCT(A1:A2,C1:C2).
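As a quick sanity check on this idea, the example objective above can be evaluated in Python and compared against the equivalent SUMPRODUCT form. The cell values below (B1, B2, D1, E1) are hypothetical constants chosen only for illustration; the superposition identity at the end is what "linear" means algebraically.

```python
# Hypothetical constant cell values (assumptions, not from the text)
B1, B2, D1, E1 = 6.0, 3.0, 2.0, 1.0

def objective(c1, c2):
    # Mirrors the worksheet formula =B1/B2*C1 + (D1*2+E1)*C2
    return B1 / B2 * c1 + (D1 * 2 + E1) * c2

def sumproduct(coeffs, cells):
    # Mirrors Excel's =SUMPRODUCT(A1:A2, C1:C2)
    return sum(a * c for a, c in zip(coeffs, cells))

# A1 holds =B1/B2 and A2 holds =(D1*2+E1), as in the text
coeffs = [B1 / B2, D1 * 2 + E1]
assert objective(4, 5) == sumproduct(coeffs, [4, 5])

# Linearity means superposition: f(a*x + b*y) = a*f(x) + b*f(y)
a, b = 2.0, -3.0
x, y = (1.0, 2.0), (4.0, -1.0)
lhs = objective(a * x[0] + b * y[0], a * x[1] + b * y[1])
rhs = a * objective(*x) + b * objective(*y)
assert abs(lhs - rhs) < 1e-12
```

The assertions pass because both C1 and C2 enter the formula only through multiplication by constants.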

#### Testing for a Linear Model

What if you have already created a complex spreadsheet model and you aren’t sure whether your objective function and constraints are linear or nonlinear functions of the variables? In Analytic Solver Platform, you can easily find out by pressing the Analyze button in the Task Pane. Moreover, you can obtain a report showing exactly which cells contain formulas that are nonlinear.

Try solving the model with the standard Simplex LP Solver. If the problem contains nonlinear functions of the variables, you will (in virtually all cases) receive the message “The linearity conditions required by this Solver engine are not satisfied.” You can then ask the Solver to produce a Linearity Report, which shows whether the objective and each of the constraints is a linear or nonlinear function of the variables. This report also shows which variables occur linearly and which occur nonlinearly in your model, which is another way of summarizing the same information. You should next look closely at the objective or constraint formulas that the Linearity Report indicates are nonlinear, and decide whether each formula can be rewritten in linear form (an important skill that we will learn about in this module).
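The principle behind such a linearity check can be sketched numerically: an affine function satisfies f(x+y) = f(x) + f(y) − f(0), so a function that violates this identity at even one pair of points cannot be linear. This is only an illustrative probe, not how the Linearity Report is actually implemented.

```python
# Numeric probe for nonlinearity: test the affine superposition identity
# f(x + y) = f(x) + f(y) - f(0) at a pair of sample points.
def is_affine_at(f, x, y, tol=1e-9):
    combined = f([xi + yi for xi, yi in zip(x, y)])
    expected = f(x) + f(y) - f([0.0] * len(x))
    return abs(combined - expected) < tol

linear_f = lambda v: 3 * v[0] + 2 * v[1]       # like =3*C1+2*C2
nonlinear_f = lambda v: v[0] ** 2 + v[1]       # like =C1^2+C2

assert is_affine_at(linear_f, [1.0, 2.0], [3.0, -1.0])
assert not is_affine_at(nonlinear_f, [1.0, 2.0], [3.0, -1.0])
```

A single failing pair of points proves nonlinearity, but passing the test at a few points does not prove linearity; a report such as the Linearity Report instead analyzes the formula structure itself.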

### An Important Note on Convexity

The key property of functions of the variables that makes a problem “easy” or “hard” to solve is *convexity*. If you have a linear model, then convexity is not an issue, since **all linear models are convex**. In general, if *all* constraints in a problem are convex functions of the variables, and if the objective is convex (if minimizing) or concave (if maximizing), then you can be confident of finding a globally optimal solution (or determining that there is no feasible solution), even if the problem is very large – up to hundreds of thousands of variables and constraints.

In contrast, if *any* constraint or the objective function is non-convex, then the problem is far more difficult: You cannot be certain of finding a feasible solution even if one exists; you must either “settle for” a locally optimal solution, or else be prepared for very long solution times and rather severe limits on the size of problems you can solve to global optimality (a few hundred to perhaps one thousand variables and constraints), even on the fastest computers. So it pays to understand convexity!

Geometrically, a function is *convex* if, at any two points x and y, the line drawn from x to y (called the *chord* from x to y) lies *on or above* the function – as shown in the diagram below, for a function of one variable. A function is *concave* if the chord from x to y lies *on or below* the function. This property extends to any number of ‘dimensions’ or variables, where x = (x₁, x₂, …, xₙ) and y = (y₁, y₂, …, yₙ).

In contrast, a non-convex function “curves up and down.” A familiar example is the sine function (SIN(C1) in Excel), which is pictured below.
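The chord definition translates directly into a numeric test: sample points along the chord between x and y, and check whether the function ever rises above it. The sketch below applies this test to =C1^2 (convex) and =SIN(C1) (non-convex); the interval endpoints are arbitrary illustrative choices.

```python
import math

# Chord test sketch: a function is convex on [x, y] only if it lies on or
# below the chord from (x, f(x)) to (y, f(y)) at every point in between.
def lies_below_chord(f, x, y, samples=101, tol=1e-12):
    for i in range(samples):
        t = i / (samples - 1)
        point = (1 - t) * x + t * y
        chord = (1 - t) * f(x) + t * f(y)
        if f(point) > chord + tol:
            return False          # function rises above the chord
    return True

assert lies_below_chord(lambda v: v * v, -2.0, 3.0)       # =C1^2 is convex
assert not lies_below_chord(math.sin, 0.0, 2 * math.pi)   # =SIN(C1) is not
```

Like the linearity probe, this is a one-sided check: a single violation proves non-convexity, while passing on sampled points only suggests convexity.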

The feasible region of an optimization problem is formed by the intersections of the constraints. The intersection of several convex constraints is always a convex region, but even one non-convex function can make the whole region non-convex – and hence make the optimization problem far more difficult to solve. It is important to keep convexity in mind when formulating your models!

### Nonlinear and Smooth Functions

A *nonlinear* function is any function of the decision variables that is not linear. Examples include =1/C1, =LOG(C1), and =C1^2, where C1 is a decision variable. All three of these examples are *continuous* functions, because the graphs of these functions, while nonlinear, contain no “breaks.” The function =IF(C1>10,D1,2*D1) is also a nonlinear function, but it is “worse” (from an optimization standpoint) because it is *discontinuous*: Its graph contains a “break” at C1=10 where the function value jumps from D1 to 2*D1. At this break, the rate of change (i.e. the derivative) of the function is undefined. Most optimization algorithms rely on derivatives to seek improved solutions, so they may have trouble with a Solver model containing functions like =IF(C1>10,D1,2*D1).

If the graph of the function’s *derivative* also contains no breaks, then the original function is called a *smooth* function. If it does contain breaks, then the original function is *non-smooth*. Every discontinuous function is also non-smooth. Some continuous functions are also non-smooth, for example =ABS(C1). The graph of this function is an unbroken “V” shape, but the graph of its derivative contains a break, jumping from –1 to +1 at C1=0. Many nonlinear Solver algorithms rely on second-order derivatives of at least the objective function to make faster progress and to test whether the optimal solution has been found; they may have trouble with functions such as =ABS(C1). In this module you will learn alternative formulations of functions such as =ABS(C1) that will not sacrifice the smoothness of your model.
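One widely used reformulation of this kind is the standard split-variable technique for absolute values: write C1 = plus − minus with two new non-negative variables, and replace =ABS(C1) by the linear expression plus + minus. The sketch below checks the identity numerically; in an actual Solver model, plus and minus would be decision variables and the substitution C1 = plus − minus would be a constraint.

```python
# Split-variable sketch of the standard linearization of =ABS(C1):
# introduce plus, minus >= 0 with C1 = plus - minus; then plus + minus
# equals |C1| whenever at most one of the two is positive (which holds
# at an optimal solution when the expression is being minimized).
def abs_linearized(c1):
    plus = max(c1, 0.0)    # in a Solver model these would be two new
    minus = max(-c1, 0.0)  # non-negative decision variables
    assert c1 == plus - minus          # the substitution constraint
    return plus + minus                # linear replacement for ABS(C1)

assert abs_linearized(-7.5) == abs(-7.5)
assert abs_linearized(3.0) == abs(3.0)
assert abs_linearized(0.0) == 0.0
```

The resulting model contains only linear expressions in plus and minus, so smoothness is preserved; the module covers when this substitution is valid.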

In general, a nonlinear function may be convex, concave or non-convex. A function can be convex but non-smooth: =ABS(C1) with its V shape is an example. A function can also be smooth but non-convex: =SIN(C1) is an example. But the “best” nonlinear functions, from the Solver’s point of view, are both *smooth* and *convex* (or concave for the objective if you are maximizing); for these problems you can expect to find a globally optimal solution.

### Discontinuous and Non-Smooth Functions

Microsoft Excel provides a very rich formula language, including many functions that are discontinuous or non-smooth. Non-smooth functions cause some difficulty, and discontinuous functions cause *considerable* difficulty, for most optimization algorithms – if your model includes such functions you will likely have to settle for a “good” solution rather than a globally optimal one. In many business situations such a “good” solution is still extremely valuable, although of course a globally optimal solution is ideal.

By far the most common discontinuous function in Excel is the IF function where the conditional test depends on the decision variables, as in the example =IF(C1>10,D1,2*D1). Here is a short list of common **discontinuous** Excel functions:

- IF, CHOOSE
- LOOKUP, HLOOKUP, VLOOKUP
- COUNT
- INT, ROUND
- CEILING, FLOOR

Formulas involving relations such as <=, = and >= (on the worksheet, not in constraints) and logical functions such as AND, OR and NOT are also discontinuous at their points of transition from FALSE to TRUE values. Functions such as SUMIF and the database functions are discontinuous if the criterion or conditional argument depends on the decision variables.
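To see why such a relation is discontinuous, consider the worksheet formula =(C1>=10): viewed as a function of C1, it is a step that jumps from FALSE (0) to TRUE (1) at exactly C1 = 10, which the short sketch below makes concrete.

```python
# A worksheet relation like =(C1>=10) is a step function of C1:
# it jumps from 0 (FALSE) to 1 (TRUE) at the threshold C1 = 10.
def relation(c1):
    return 1.0 if c1 >= 10 else 0.0

# Arbitrarily close on either side of the threshold, the value jumps.
assert relation(9.999999) == 0.0
assert relation(10.0) == 1.0
```

No matter how small a step the Solver takes across C1 = 10, the function value changes by a full unit, so the derivative information most algorithms depend on is unavailable there.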

Here is a short list of common **non-smooth** Excel functions:

- ABS
- MIN, MAX

As you will learn in this module, Analytic Solver Platform can **automatically transform** a model that uses IF, AND, OR, NOT, ABS, MIN and MAX, and relations <, <=, >= and > to an equivalent model where these functions and relations are replaced by additional binary and continuous variables and additional constraints that have the same effect – for the purpose of optimization – as the replaced functions. This powerful facility may be able to transform your discontinuous or non-smooth model into a smooth or even linear model with integer variables.
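To give a flavor of what such a transformation does, here is a hand-worked sketch of one standard approach, the big-M method, applied to =IF(C1>10,D1,2*D1). A binary variable b is forced to 1 when C1 > 10 and to 0 otherwise, and the IF is replaced by a linear expression in b. The bound M and the value of D1 are hypothetical, and the actual transformation Analytic Solver Platform performs may differ in detail.

```python
# Big-M sketch of linearizing =IF(C1>10, D1, 2*D1) with a binary b:
#   C1 - 10 <= M*b         forces b = 1 whenever C1 > 10
#   10 - C1 <= M*(1 - b)   forces b = 0 whenever C1 < 10
#   result  =  D1*b + 2*D1*(1 - b)   -- linear once b is a variable
# (strict inequalities are approximated; the boundary C1 = 10 needs care).
M, D1 = 1000.0, 5.0        # hypothetical bound and cell value

def if_linearized(c1):
    b = 1 if c1 > 10 else 0            # value the binary variable takes
    assert c1 - 10 <= M * b            # big-M constraints are satisfied
    assert 10 - c1 <= M * (1 - b)
    return D1 * b + 2 * D1 * (1 - b)   # linear in D1 and b

assert if_linearized(12.0) == D1       # C1 > 10: the "then" branch
assert if_linearized(4.0) == 2 * D1    # C1 <= 10: the "else" branch
```

The price of the transformation is the new binary variable and constraints, which is why the resulting model is a (mixed-integer) linear model rather than a plain linear one, and why M must be a genuine bound on how far C1 can stray from 10.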

Although some models can *only* be expressed with the aid of these discontinuous or non-smooth functions, in other cases you have a degree of choice in how you model the real-world problem. Even when you have Analytic Solver Platform’s powerful transformation abilities at your disposal, it is preferable to directly formulate the most “Solver-friendly” (i.e. smooth, continuous, and preferably linear) model of your business problem. In this module you will learn several techniques for “linearizing” more complex optimization models.