When you use a mathematical model to describe reality, you must make approximations. The world is more complicated than the kinds of optimization problems that we are able to solve. Linearity assumptions usually are significant approximations. Another important approximation comes because you cannot be sure of the data that you put into the model. Your knowledge of the relevant technology may be imprecise, forcing you to approximate values in A, b, or c. Moreover, information may change. Sensitivity analysis is a systematic study of how sensitive (duh) solutions are to (small) changes in the data. The basic idea is to be able to give answers to questions of the form:

1. If the objective function changes, how does the solution change?
2. If resources available change, how does the solution change?
3. If a constraint is added to the problem, how does the solution change?

One approach to these questions is to solve lots of linear programming problems. For example, if you think that the price of your primary output will be between $100 and $120 per unit, you can solve twenty different problems (one for each whole number between $100 and $120).[1] This method would work, but it is inelegant and (for large problems) would involve a large amount of computation time. (In fact, computation time is cheap, and computing solutions to similar problems is a standard technique for studying sensitivity in practice.) The approach that I will describe in these notes takes full advantage of the structure of LP problems and their solutions. It turns out that you can often figure out what happens in "nearby" linear programming problems just by thinking and by examining the information provided by the simplex algorithm. In this section, I will describe the sensitivity analysis information provided in Excel computations. I will also try to give an intuition for the results.
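The brute-force approach can be sketched directly: loop over the candidate prices and re-solve one LP per price. The small two-product LP below is entirely hypothetical (its resource coefficients, and the fixed $80 price of the second product, are made up for illustration); only the looping pattern matters.

```python
# Brute-force sensitivity analysis: re-solve the LP once per candidate price.
# The two-product production LP here is hypothetical, purely to illustrate
# the "solve twenty-odd problems" approach described above.
from scipy.optimize import linprog

A_ub = [[1, 2],   # machine-hours per unit of products 1 and 2 (made up)
        [3, 1]]   # labor-hours per unit of products 1 and 2 (made up)
b_ub = [40, 60]   # hours available of each resource

results = {}
for price in range(100, 121):        # one problem per whole-dollar price
    # linprog minimizes, so negate the (maximization) profit coefficients
    res = linprog(c=[-price, -80], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * 2)
    results[price] = (res.x, -res.fun)

for price, (x, value) in results.items():
    print(price, x, value)
```

For a handful of parameter values this is perfectly practical; the point of the analysis developed below is to read off the same information from a single solve.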
Intuition and Overview
Throughout these notes you should imagine that you must solve a linear programming problem, but then you want to see how the answer changes if the problem is changed. In every case, the results assume that only one thing about the problem changes. That is, in sensitivity analysis you evaluate what happens when only one parameter of the problem changes.

[1] OK, there are really 21 problems, but who is counting?
To fix ideas, you may think about a particular LP, say the familiar example:

max 2x1 + 4x2 + 3x3 + x4
subject to 3x1 + x2 + x3 + 4x4 ≤ 12
           x1 − 3x2 + 2x3 + 3x4 ≤ 7
           2x1 + x2 + 3x3 − x4 ≤ 10
           x1, x2, x3, x4 ≥ 0

We know that the solution to this problem is x0 = 42, x1 = 0, x2 = 10.4, x3 = 0, x4 = 0.4 (where x0 denotes the value of the objective function at the solution).
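If you have a solver handy, the stated solution can be checked numerically. This is a sketch assuming the example LP reads max 2x1 + 4x2 + 3x3 + x4 subject to 3x1 + x2 + x3 + 4x4 ≤ 12, x1 − 3x2 + 2x3 + 3x4 ≤ 7, 2x1 + x2 + 3x3 − x4 ≤ 10, with all variables nonnegative; these data are consistent with the reported solution.

```python
from scipy.optimize import linprog

# The example LP (a maximization; linprog minimizes, so negate c).
c = [-2, -4, -3, -1]
A_ub = [[3, 1, 1, 4],
        [1, -3, 2, 3],
        [2, 1, 3, -1]]
b_ub = [12, 7, 10]

res = linprog(c=c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print(-res.fun)   # value of the LP: 42.0
print(res.x)      # optimal x, approximately [0, 10.4, 0, 0.4]
```

Note that the first and third constraints are binding at the optimum, while the second has slack; this structure is what drives the sensitivity results that follow.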
Changing the Objective Function
Suppose that you solve an LP and then wish to solve another problem with the same constraints but a slightly different objective function. (I will always make only one change in the problem at a time. So if I change the objective function, not only will I hold the constraints fixed, but I will change only one coefficient in the objective function.)
When you change the objective function, it turns out that there are two cases to consider. The first case is a change in the coefficient of a non-basic variable (a variable that takes on the value zero in the solution). In the example, the relevant non-basic variables are x1 and x3.
What happens to your solution if the coefficient of a non-basic variable decreases? For example, suppose that the coefficient of x1 in the objective function above was reduced from 2 to 1 (so that the objective function is: max x1 + 4x2 + 3x3 + x4). What has happened is this: You have taken a variable that you didn't want to use in the first place (you set x1 = 0) and then made it less profitable (lowered its coefficient in the objective function). You are still not going to use it. The solution does not change.

Observation If you lower the objective function coefficient of a non-basic variable, then the solution does not change.
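The observation can be checked numerically. A minimal sketch, again assuming the example LP is max 2x1 + 4x2 + 3x3 + x4 subject to 3x1 + x2 + x3 + 4x4 ≤ 12, x1 − 3x2 + 2x3 + 3x4 ≤ 7, 2x1 + x2 + 3x3 − x4 ≤ 10, x ≥ 0: solve the problem twice, once with the coefficient of the non-basic variable x1 lowered from 2 to 1, and compare the two solutions.

```python
from scipy.optimize import linprog

A_ub = [[3, 1, 1, 4],
        [1, -3, 2, 3],
        [2, 1, 3, -1]]
b_ub = [12, 7, 10]
bounds = [(0, None)] * 4

# Original objective: max 2x1 + 4x2 + 3x3 + x4 (negated for linprog)
base = linprog(c=[-2, -4, -3, -1], A_ub=A_ub, b_ub=b_ub, bounds=bounds)
# Coefficient of the non-basic variable x1 lowered from 2 to 1
lower = linprog(c=[-1, -4, -3, -1], A_ub=A_ub, b_ub=b_ub, bounds=bounds)

# x1 was 0 before; making it less profitable leaves the solution unchanged.
print(base.x)
print(lower.x)
```

Both solves return the same point, with x1 = 0 and the same objective value of 42, exactly as the observation predicts.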