Parametric objective function (part 1)
Author(s): Saul Gass
U. S. Air Force
United States
Thomas Saaty
Joseph M. Katz Graduate School of Business
University of Pittsburgh
United States
Publication date: August 1954
Journal: Journal of the Operations Research Society of America, Vol. 2, Issue 3, Pages 316-319
Abstract: In the linear programming problem where there is a linear function to be optimized (called the objective function), it is desirable to study the behavior of solutions when the (cost) coefficients in the objective function are parametrized. The problem is then to find the set of xj (j = 1, 2, …, n) which minimizes the linear form ∑(cj + λcj′)xj and satisfies the constraints xj ≥ 0 and ∑ aij xj = ai0 (i = 1, 2, …, m), where cj, cj′, aij, and ai0 are constants, with at least one cj′ ≠ 0, and λ a parameter. Using the simplex method, a computational procedure is described which enables one, given a feasible solution, to determine the values of the parameter (if any) for which the solution minimizes the objective function, and, given a minimum feasible solution, to generate by the simplex algorithm all minimum feasible solutions and the corresponding values of the parameter. To a minimum solution there corresponds one interval of values of λ (nondegeneracy assumed). The process indicates how to obtain a new solution and the corresponding interval of values of λ, contiguous to the previous one, the two intervals having only one point in common.
Keywords: Linear programming, Set theory, Linear function
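The central computation described in the abstract is: given a basic feasible solution, find the interval of λ over which it remains minimal. The following Python sketch illustrates that idea under simplifying assumptions; it is not the authors' tableau-based procedure. For a basis B of the standard-form problem, each nonbasic column j has a reduced cost that is linear in λ, and requiring all of them to be nonnegative yields one closed interval of λ. The function name optimality_interval and the small example data (A, b, c, cp, basis) are hypothetical, introduced only for illustration.

```python
# Sketch (assumed formulation, not the paper's algorithm): for
#     minimize  sum_j (c_j + lam * cp_j) * x_j   s.t.  A x = b,  x >= 0,
# compute the interval of lam for which a given basis stays optimal.
import numpy as np

def optimality_interval(A, c, cp, basis):
    """Return (lam_lo, lam_hi) such that the basis is optimal for every
    lam in [lam_lo, lam_hi]; endpoints may be -inf / +inf, or None if
    the basis is optimal for no lam."""
    m, n = A.shape
    B_inv = np.linalg.inv(A[:, basis])
    nonbasic = [j for j in range(n) if j not in basis]

    lam_lo, lam_hi = -np.inf, np.inf
    for j in nonbasic:
        y = B_inv @ A[:, j]
        # Reduced cost of column j as a linear function of lam:
        #   z_j(lam) = alpha + lam * beta, optimality needs z_j(lam) >= 0.
        alpha = c[j] - c[basis] @ y
        beta = cp[j] - cp[basis] @ y
        if beta > 0:            # gives a lower bound on lam
            lam_lo = max(lam_lo, -alpha / beta)
        elif beta < 0:          # gives an upper bound on lam
            lam_hi = min(lam_hi, -alpha / beta)
        elif alpha < 0:         # violated for every lam
            return None
    return lam_lo, lam_hi

# Tiny assumed example: two structural variables plus two slacks.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-1.0, -2.0, 0.0, 0.0])    # fixed coefficients c_j
cp = np.array([1.0, 0.0, 0.0, 0.0])     # parametric coefficients c_j'
basis = [0, 1]                          # a basic feasible basis

x_B = np.linalg.inv(A[:, basis]) @ b    # basic variable values: [2., 2.]
print("basic solution x_B =", x_B)
print("optimal for lam in", optimality_interval(A, c, cp, basis))
```

For this example the routine reports the interval [-1, 0], matching the abstract's statement that each minimum solution corresponds to a single interval of λ; stepping to an adjacent basis by a simplex pivot would produce the contiguous interval sharing one endpoint.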