Optimization
WHAT IS OPTIMIZATION?
• A set of unknowns or variables that affect the value of the objective function.
In the manufacturing problem, the variables might include the amounts of different
resources used or the time spent on each activity.
In the data-fitting problem, the unknowns are the parameters that define the model.
In the panel design problem, the variables define the shape and dimensions of
the panel.
• A set of constraints that allow the unknowns to take on certain values but
exclude others.
For the manufacturing problem, it does not make sense to spend a negative amount
of time on any activity, so we constrain all the "time" variables to be non-negative.
In the panel design problem, we would probably want to limit the weight of the
product and to constrain its shape.
The optimization problem is then: find values of the variables that minimize or maximize
the objective function while satisfying the constraints.
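Written out, the problem takes a standard form like the one below, where x collects the
variables, f is the objective, and the functions g_i and h_j (generic names used here for
illustration; the guide fixes no particular notation) express the inequality and equality
constraints. A maximization problem fits the same form after replacing f by -f.

    \begin{aligned}
    \min_{x \in \mathbb{R}^n} \quad & f(x) \\
    \text{subject to} \quad & g_i(x) \le 0, \quad i = 1, \dots, m, \\
                            & h_j(x) = 0,  \quad j = 1, \dots, p.
    \end{aligned}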
Objective Function
Almost all optimization problems have a single objective function. (When they don't,
they can often be reformulated so that they do!)
• Multiple objective functions. Often, the user would actually like to optimize a
number of different objectives at once. For instance, in the panel design
problem, it would be nice to minimize weight and maximize strength
simultaneously. Usually, the different objectives are not compatible; the
variables that optimize one objective may be far from optimal for the others. In
practice, problems with multiple objectives are reformulated as single-objective
problems by either forming a weighted combination of the different objectives
or else replacing some of the objectives by constraints. These approaches and
others are described in our section on multi-objective optimization.
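As a concrete illustration of the weighted-combination approach, the sketch below folds a
weight objective and a strength objective into one scalar function that a standard solver
can handle. The toy weight and strength models, the coefficients w1 and w2, and the use of
SciPy are illustrative assumptions, not details taken from the guide.

    # Minimal sketch of weighted-sum scalarization for the panel example.
    # The weight/strength models and the coefficients are illustrative only.
    import numpy as np
    from scipy.optimize import minimize

    def panel_weight(x):        # x = (thickness, width); toy weight model
        return x[0] * x[1]

    def panel_strength(x):      # toy strength model
        return np.sqrt(x[0]) * x[1]

    w1, w2 = 1.0, 0.5           # relative importance of the two objectives

    def combined_objective(x):
        # minimizing weight and maximizing strength: subtract the strength term
        return w1 * panel_weight(x) - w2 * panel_strength(x)

    result = minimize(combined_objective, x0=[1.0, 1.0],
                      bounds=[(0.1, 2.0), (0.1, 2.0)])
    print(result.x)

Replacing some objectives by constraints works the same way: for instance, one could
minimize panel_weight alone subject to an explicit lower bound on panel_strength.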
Variables
These are essential. If there are no variables, we cannot define the objective function
and the problem constraints.
Constraints
Constraints are not essential. In fact, the field of unconstrained optimization is a large
and important one for which a lot of algorithms and software are available. It's been
argued that almost all problems really do have constraints. For example, any variable
denoting the "number of objects" in a system can be useful only if it is less than the
number of elementary particles in the known universe! In practice, though, answers that
make good sense in terms of the underlying physical or economic problem can often be
obtained without putting constraints on the variables.
___________________________________________________________________
Continuous Optimization, in which all the variables are allowed to take values from
subintervals of the real line.
Unconstrained Optimization
Most codes for smooth unconstrained problems are based on Newton's method, in which each
step minimizes a quadratic model of the objective built around the current iterate. These
codes obtain convergence when the starting point is not close to a minimizer by using
either a line-search or a trust-region approach.
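In the line-search approach, the algorithm picks a descent direction and then shortens a
trial step along it until a sufficient-decrease test is satisfied. The sketch below shows a
basic backtracking (Armijo) line search inside a plain steepest-descent loop; the constants
and the Rosenbrock test function are illustrative choices, not something the guide
prescribes.

    # Minimal sketch of a backtracking (Armijo) line search.
    # The constants (rho = 0.5, c = 1e-4) and the test function are illustrative.
    import numpy as np

    def rosenbrock(x):
        return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

    def rosenbrock_grad(x):
        return np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                         200*(x[1] - x[0]**2)])

    def backtracking_step(f, grad, x, direction, alpha=1.0, rho=0.5, c=1e-4):
        # shrink alpha until the sufficient-decrease (Armijo) condition holds
        g = grad(x)
        while f(x + alpha * direction) > f(x) + c * alpha * g.dot(direction):
            alpha *= rho
        return alpha

    x = np.array([-1.2, 1.0])
    for _ in range(500):                 # plain steepest-descent loop
        d = -rosenbrock_grad(x)          # descent direction
        alpha = backtracking_step(rosenbrock, rosenbrock_grad, x, d)
        x = x + alpha * d
    print(x)                             # slow but steady progress toward (1, 1)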
Trust-region variants use the original quadratic model function, but they constrain the
new iterate to stay in a local neighborhood of the current iterate. To find the step, then,
we have to minimize the quadratic model subject to staying in this neighborhood, which is
generally ellipsoidal in shape.
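The simplest approximate solution of this subproblem is the Cauchy point, which minimizes
the quadratic model along the steepest-descent direction while keeping the step inside the
region (taken here, for simplicity, as a sphere of radius delta). The following fragment is
an illustrative sketch, not code from the guide.

    # Sketch: the Cauchy point for the trust-region subproblem
    #   minimize  g.p + 0.5 * p.B.p   subject to  ||p|| <= delta
    # All of the symbols and values below are illustrative.
    import numpy as np

    def cauchy_point(g, B, delta):
        gBg = g @ B @ g
        if gBg <= 0:
            tau = 1.0            # model curves downward along -g: go to the boundary
        else:
            tau = min(1.0, np.linalg.norm(g)**3 / (delta * gBg))
        return -tau * (delta / np.linalg.norm(g)) * g

    g = np.array([2.0, -1.0])                # toy gradient
    B = np.array([[4.0, 1.0], [1.0, 3.0]])   # toy positive definite Hessian
    print(cauchy_point(g, B, 0.5))           # step of length at most 0.5

Practical trust-region codes refine this step, for example with dogleg or conjugate-gradient
techniques, but the constrained-model idea is the same.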
So far, we have assumed that the Hessian matrix is available, but the algorithms are
unchanged if the Hessian matrix is replaced by a reasonable approximation. Two kinds of
methods use approximate Hessians in place of the real thing:
• Quasi-Newton methods build up an approximation to the Hessian from the changes in the
gradient observed at successive iterates; the BFGS update is the best-known example.
• Finite-difference methods approximate the Hessian, or its action on a vector, by
differencing gradient values at nearby points.
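For example, a quasi-Newton method such as BFGS needs only gradient values and builds its
own Hessian approximation from the changes in the gradient between iterates. The call below
uses SciPy's implementation, an assumed tool choice for illustration only.

    # Sketch: quasi-Newton (BFGS) minimization using gradients only.
    # The test function is an illustrative choice.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        return (x[0] - 3)**2 + 10 * (x[1] + 1)**4

    def grad(x):
        return np.array([2 * (x[0] - 3), 40 * (x[1] + 1)**3])

    res = minimize(f, np.zeros(2), method='BFGS', jac=grad)
    print(res.x)                 # close to the minimizer (3, -1)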
Finally, we mention two other approaches for unconstrained problems that are not so
closely related to Newton's method:
Nonlinear conjugate gradient methods are motivated by the success of the linear
conjugate gradient method in minimizing quadratic functions with positive
definite Hessians. They use search directions that combine the negative gradient
direction with another direction, chosen so that the search will take place along a
direction not previously explored by the algorithm. At least, this property holds
for the quadratic case, for which the minimizer is found exactly within just n
iterations. For nonlinear problems, performance is problematic, but these methods
do have the advantage that they require only gradient evaluations and do not use
much storage.
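As an illustration, the sketch below applies a library nonlinear conjugate gradient routine
(SciPy's 'CG' method, an assumed tool choice) to a strictly convex quadratic, the case in
which the method behaves like linear conjugate gradients; only gradient values are supplied.

    # Sketch: nonlinear conjugate gradient on a convex quadratic.
    # The matrix A, vector b, and the SciPy solver choice are illustrative.
    import numpy as np
    from scipy.optimize import minimize

    A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
    b = np.array([1.0, -2.0])

    def f(x):
        return 0.5 * x @ A @ x - b @ x

    def grad(x):
        return A @ x - b

    res = minimize(f, np.zeros(2), method='CG', jac=grad)
    print(res.x, np.linalg.solve(A, b))      # the two answers should agree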
The nonlinear Simplex method (not to be confused with the simplex method for
linear programming) requires neither gradient nor Hessian evaluations. Instead,
it performs a pattern search based only on function values. Because it makes
little use of information about f, it typically requires a great many iterations to
find a solution that is even in the ballpark. It can be useful when f is nonsmooth
or when derivatives are impossible to find, but it is unfortunately often used
when one of the algorithms above would be more appropriate.
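For illustration, the sketch below applies the Nelder-Mead simplex method to a simple
nonsmooth function using only function values; the solver choice and the test function are
illustrative assumptions rather than recommendations from the guide.

    # Sketch: derivative-free minimization with the Nelder-Mead simplex method.
    # Only function values are used, so the nonsmooth |.| terms are tolerated.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        # nonsmooth: the absolute values have no derivative along x[0]=1, x[1]=-2
        return abs(x[0] - 1) + abs(x[1] + 2)

    res = minimize(f, np.array([5.0, 5.0]), method='Nelder-Mead')
    print(res.x)                 # close to (1, -2)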
Constrained Optimization
Quadratic Programming
Linear Programming
Semidefinite Programming
Stochastic Programming
Network Programming
Global Optimization
Global optimization algorithms try to find an x* that minimizes f over all possible
vectors x. This is a much harder problem to solve. We do not discuss it here because, at
present, no efficient algorithm is known for performing this task. For many applications,
local minima are good enough, particularly when the user can draw on his/her own
experience and provide a good starting point for the algorithm.
Nondifferentiable Optimization
______________________________________________________________________
Discrete Optimization, in which you require some or all of the variables to have
integer values.
______________________________________________________________________