Learning outcomes • Use the Big M Method to solve a linear programming problem.

In the previous discussions of the Simplex algorithm we have seen that the method must start with a basic feasible solution. In our examples so far, we have looked at problems that, when put into standard LP form, conveniently have an all-slack starting solution. An all-slack solution is only a possibility when all of the constraints in the problem are <= constraints. If the problem has any >= or = constraints, a starting basic feasible solution may not be readily apparent. The Big M method is a version of the Simplex Algorithm that first finds a basic feasible solution by adding "artificial" variables to the problem. The objective function of the original LP must, of course, be modified to ensure that the artificial variables are all equal to 0 at the conclusion of the simplex algorithm.

Steps

1. Modify the constraints so that the RHS of each constraint is nonnegative. (This requires that each constraint with a negative RHS be multiplied by -1. Remember that if you multiply an inequality by any negative number, the direction of the inequality is reversed!) After modification, identify each constraint as a <=, >=, or = constraint.

2. Convert each inequality constraint to standard form (if constraint i is a <= constraint, we add a slack variable si; and if constraint i is a >= constraint, we subtract an excess variable ei).

3. Add an artificial variable ai to each constraint identified as a >= or = constraint at the end of Step 1. Also add the sign restriction ai >= 0.

4. If the LP is a max problem, add (for each artificial variable) -Mai to the objective function, where M denotes a very large positive number.

5. If the LP is a min problem, add (for each artificial variable) Mai to the objective function.

6. Solve the transformed problem by the simplex algorithm. Since each artificial variable will be in the starting basis, all artificial variables must be eliminated from row 0 before beginning the simplex. (In choosing the entering variable, remember that M is a very large positive number!) If all artificial variables are equal to zero in the optimal solution, we have found the optimal solution to the original problem. If any artificial variable is positive in the optimal solution, the original problem is infeasible!

Let's look at an example.

Example 1

Minimize z = 4x1 + x2

Subject to:
3x1 + x2 = 3
4x1 + 3x2 >= 6
x1 + 2x2 <= 4
x1, x2 >= 0
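Steps 1-5 above can be sketched as a short script. This is a minimal sketch of my own (the helper name `big_m_setup` and the choice to approximate the symbolic M by the large number 1e6 are my assumptions, not the text's notation), applied to Example 1; note that it orders the added columns as slack/excess first, then artificials, which differs slightly from the tableau shown later.

```python
import numpy as np

def big_m_setup(c, A, b, kinds, M=1e6):
    """Return the penalized objective, augmented constraint matrix, RHS,
    and starting basis (one basic column index per row) for a min problem."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    c = list(map(float, c))
    kinds = list(kinds)
    m = len(b)
    # Step 1: make every RHS nonnegative; multiplying a row by -1
    # reverses the direction of its inequality.
    flip = {"<=": ">=", ">=": "<=", "=": "="}
    for i in range(m):
        if b[i] < 0:
            A[i] *= -1.0
            b[i] *= -1.0
            kinds[i] = flip[kinds[i]]
    extra, basis = [], [None] * m
    def add_col(i, sign, cost, basic):
        col = np.zeros(m)
        col[i] = sign
        extra.append(col)
        c.append(cost)
        if basic:
            basis[i] = A.shape[1] + len(extra) - 1
    # Step 2: slack (+1, starts basic) for <=; excess (-1, nonbasic) for >=
    for i, kind in enumerate(kinds):
        if kind == "<=":
            add_col(i, +1.0, 0.0, basic=True)
        elif kind == ">=":
            add_col(i, -1.0, 0.0, basic=False)
    # Steps 3 and 5 (min problem): artificial ai with cost +M for >= and =
    for i, kind in enumerate(kinds):
        if kind in (">=", "="):
            add_col(i, +1.0, M, basic=True)
    A_aug = np.hstack([A] + [col.reshape(-1, 1) for col in extra])
    return np.array(c), A_aug, b, basis

# Example 1 from the text: min z = 4x1 + x2
c, A, b, basis = big_m_setup(
    [4, 1],
    [[3, 1], [4, 3], [1, 2]],
    [3, 6, 4],
    ["=", ">=", "<="],
)
# A now has 6 columns (x1, x2, excess, slack, two artificials), and the
# starting basis is the two artificials plus the slack.
```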

By introducing a surplus S2 in the second constraint and a slack s3 in the third, we get the following LP in standard form:

Minimize z = 4x1 + x2

Subject to:
3x1 + x2 = 3
4x1 + 3x2 - S2 = 6
x1 + 2x2 + s3 = 4
x1, x2, S2, s3 >= 0

Neither of the first two constraint equations has a slack variable or other variable that we can use to be basic in a feasible starting solution, so we

must use artificial variables. If we introduce the artificial variables R1 and R2 into the first two constraints, respectively, and MR1 + MR2 into the objective function, we obtain:

Minimize z = 4x1 + x2 + MR1 + MR2

Subject to:
3x1 + x2 + R1 = 3
4x1 + 3x2 - S2 + R2 = 6
x1 + 2x2 + s3 = 4
x1, x2, S2, s3, R1, R2 >= 0

We can now set x1, x2 and S2 to zero and use R1, R2 and s3 as the starting basic feasible solution. In tableau form we have:

Basic   z   x1   x2   S2   R1   R2   s3   Solution
z       1   -4   -1    0   -M   -M    0      0
R1      0    3    1    0    1    0    0      3
R2      0    4    3   -1    0    1    0      6
s3      0    1    2    0    0    0    1      4
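To see Step 6 carried to completion, here is a minimal dense-tableau simplex of my own (a sketch, not the text's hand computation), with the symbolic M replaced by the large number 1e6 and columns ordered x1, x2, S2, R1, R2, s3 as in the tableau:

```python
import numpy as np

M = 1e6
c = np.array([4.0, 1.0, 0.0, M, M, 0.0])
A = np.array([[3.0, 1.0,  0.0, 1.0, 0.0, 0.0],
              [4.0, 3.0, -1.0, 0.0, 1.0, 0.0],
              [1.0, 2.0,  0.0, 0.0, 0.0, 1.0]])
b = np.array([3.0, 6.0, 4.0])
basis = [3, 4, 5]                      # R1, R2, s3 start basic

T = np.hstack([A, b.reshape(-1, 1)])   # working tableau; rows stay B^-1 A
m, n = A.shape
while True:
    reduced = c - c[basis] @ T[:, :n]  # reduced costs c_j - z_j
    j = int(np.argmin(reduced))
    if reduced[j] >= -1e-7:            # no negative reduced cost: optimal
        break
    col = T[:, j]
    ratios = np.full(m, np.inf)        # ratio test picks the leaving row
    pos = col > 1e-9
    ratios[pos] = T[pos, n] / col[pos]
    i = int(np.argmin(ratios))
    T[i] /= T[i, j]                    # pivot on (i, j)
    for k in range(m):
        if k != i:
            T[k] -= T[k, j] * T[i]
    basis[i] = j

x = np.zeros(n)
x[basis] = T[:, n]
# x1 ~ 0.4, x2 ~ 1.8, both artificials driven to 0, so z = 4x1 + x2 ~ 3.4
print(x[:2], 4 * x[0] + x[1])
```

Because both artificials end at zero, the optimum of the transformed problem is also the optimum of the original problem, exactly as Step 6 promises.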

At this point, we have our starting solution in place, but we must adjust our z-row to reflect the fact that we have introduced the variables R1 and R2 with nonzero coefficients (M). We can see that if we substitute 3 and 6 into the objective function for R1 and R2, respectively, then z = 3M + 6M = 9M. In our tableau, however, z is shown to be equal to 0. We can eliminate this inconsistency by substituting out R1 and R2 in the z-row. Because each artificial variable's column contains exactly one 1, we can accomplish this by multiplying each of the first two constraint rows by M and adding them both to the current z-row....
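The arithmetic of this elimination can be checked mechanically by tracking each z-row coefficient as a pair (constant part, multiple of M); this pair bookkeeping is my own device, not notation from the text:

```python
import numpy as np

# Each coefficient is a pair (constant part, coefficient of M);
# columns: x1, x2, S2, R1, R2, s3, Solution.
z_row = np.array([[-4, 0], [-1, 0], [0, 0], [0, -1],
                  [0, -1], [0, 0], [0, 0]], dtype=float)
r1_row = np.array([3, 1, 0, 1, 0, 0, 3], dtype=float)
r2_row = np.array([4, 3, -1, 0, 1, 0, 6], dtype=float)

# Adding M*(R1 row) + M*(R2 row) to the z-row only touches the
# M part of each pair.
z_row[:, 1] += r1_row + r2_row

# Resulting z-row: x1 -> -4+7M, x2 -> -1+4M, S2 -> -M,
# R1 -> 0, R2 -> 0, s3 -> 0, Solution -> 9M (matching z = 9M above).
for name, (const, m_part) in zip(
        ["x1", "x2", "S2", "R1", "R2", "s3", "Sol"], z_row):
    print(f"{name}: {const:+g} {m_part:+g}M")
```

The artificial-variable entries come out to exactly 0, which is the whole point of the elimination: R1 and R2 are basic, so their z-row coefficients must vanish before the simplex iterations begin.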