I have a "continuous" linear programming problem that involves maximizing a linear function over a curved convex space. In typical LP problems, the convex space is a polytope, but in this case the convex space is piecewise curved -- that is, it has faces, edges, and vertices, but the edges aren t straight and the faces aren t flat. Instead of being specified by a finite number of linear inequalities, I have a continuously infinite number. I m currently dealing with this by approximating the surface by a polytope, which means discretizing the continuously infinite constraints into a very large finite number of constraints.
I'm also in the situation where I'd like to know how the answer changes under small perturbations to the underlying problem. Thus, I'd like to be able to supply an initial condition to the solver based on a nearby solution. I believe this capability is called a "warm start."
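For example (if I understand the mechanism correctly), a solver like Gurobi exposes the simplex basis, so the optimal basis of one LP can be copied onto a nearby LP before re-solving. Here is a rough sketch with gurobipy of the kind of thing I mean; the tiny two-variable LP is just a stand-in, not my actual problem:

```python
# Sketch of an LP basis warm start with gurobipy (Gurobi is just one example
# of a solver exposing the basis; the toy LP below is unrelated to my real problem).
import gurobipy as gp
from gurobipy import GRB

def build(obj_x):
    """min obj_x*x + y  s.t.  x + 2y >= 4,  3x + y >= 6,  0 <= x, y <= 10."""
    m = gp.Model("toy")
    x = m.addVar(lb=0.0, ub=10.0, name="x")
    y = m.addVar(lb=0.0, ub=10.0, name="y")
    m.addConstr(x + 2 * y >= 4, name="c0")
    m.addConstr(3 * x + y >= 6, name="c1")
    m.setObjective(obj_x * x + y, GRB.MINIMIZE)
    return m

base = build(1.0)
base.optimize()
vbasis = [v.VBasis for v in base.getVars()]      # simplex basis of the solution
cbasis = [c.CBasis for c in base.getConstrs()]

# "Nearby" problem: same structure, slightly perturbed objective.
pert = build(1.05)
pert.update()                                    # make vars/constrs queryable
for v, b in zip(pert.getVars(), vbasis):
    v.VBasis = b                                 # seed the old optimal basis
for c, b in zip(pert.getConstrs(), cbasis):
    c.CBasis = b
pert.optimize()
```

From what I can tell, modifying a model in place and re-optimizing reuses the previous basis automatically; the explicit VBasis/CBasis copy above is for the case where the perturbed problem is built as a separate model.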
Can someone help me distinguish between the various LP packages out there? I'm less concerned with user-friendliness than with speed (for large numbers of constraints), support for high-precision arithmetic, and warm starts.
Thanks!
EDIT: Judging from the conversation with answerers so far, I should be clearer about the problem I'm trying to solve. A simplified version is the following:
I have N fixed functions f_i(y) of a single real variable y. I want to find x_i (i=1,...,N) that minimize sum_{i=1}^N x_i f_i(0), subject to the constraints:
- sum_{i=1}^N x_i f_i(1) = 1, and
- sum_{i=1}^N x_i f_i(y) >= 0 for all y >= 2
More succinctly, if we define the function F(y) = sum_{i=1}^N x_i f_i(y), then I want to minimize F(0) subject to the conditions that F(1) = 1 and F(y) >= 0 on the entire interval [2, infinity). Note that this latter nonnegativity condition is really an infinite number of linear constraints on the x_i's, one for each y. You can think of y as a label -- it is not an optimization variable. A specific y_0 restricts me to the half-space F(y_0) >= 0 in the space of x_i's. As I vary y_0 between 2 and infinity, these half-spaces change continuously, carving out a curved convex shape. The geometry of this shape depends implicitly (and in a complicated way) on the functions f_i.
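For concreteness, here is a rough sketch of the discretized version I'm solving now, using scipy.optimize.linprog. The exponential stand-ins for the f_i, the grid from y=2 to y=50, and the box bounds on the x_i are placeholders chosen only to make the toy problem well-posed; my real f_i are more complicated.

```python
# Minimal sketch of the discretization described above, via scipy.optimize.linprog.
# Basis functions, grid, truncation point, and box bounds are all placeholders.
import numpy as np
from scipy.optimize import linprog

N = 5
f = [lambda y, k=k: np.exp(-k * y) for k in range(1, N + 1)]  # stand-ins for f_i

# Objective: minimize F(0) = sum_i x_i f_i(0)
c = np.array([fi(0.0) for fi in f])

# Equality constraint: F(1) = sum_i x_i f_i(1) = 1
A_eq = np.array([[fi(1.0) for fi in f]])
b_eq = np.array([1.0])

# Discretized positivity constraints: F(y_j) >= 0 on a grid approximating
# [2, infinity).  linprog expects A_ub @ x <= b_ub, so negate.
y_grid = np.linspace(2.0, 50.0, 2000)        # truncation and spacing: placeholders
A_ub = -np.array([[fi(y) for fi in f] for y in y_grid])
b_ub = np.zeros(len(y_grid))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(-10.0, 10.0)] * N,    # box bounds only to keep the toy bounded
              method="highs")
print(res.status, res.fun)
```

The real version uses a much finer grid and more functions, which is why speed with large numbers of constraints matters, and I'd like to reuse a solution when the f_i are perturbed slightly.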