Unconstrained optimization methods
• Variables, x1, x2, x3, and so on, are the inputs – the things you can control.
• Consider a football coach. His main goal is to maximize running yards – this becomes his objective function.
• He can make his athletes spend practice time in the weight room, running sprints, or practicing ball protection. The amount of time spent on each is a variable.
• However, there are limits on the total amount of time he has. Also, if he completely sacrifices ball protection he may see running yards go up, but also fumbles, so he may place an upper limit on the number of fumbles he considers acceptable. These are constraints.
Note that the variables influence the objective function and the constraints place limits
on the domain of the variables.
Practice Question 1
Group the following into things that might be maximized, things that might be minimized, and things that cannot be optimized.
1. When choosing a new phone and plan, you might consider: minutes of talk time per
month; how much is charged for overages; whether extra minutes roll over; amount of
data allowed; cost per month; amount of storage/memory; how many phones are
available; brands/types of available phones; cost of the phone; amount of energy used;
time it takes to download apps or music; whether or not you get signal in your home.
Practice Question 2
2. An airplane designer is trying to build the most fuel-efficient airplane possible.
Write one factor as an objective (“Minimize/maximize _____”) and the rest as
constraints ( “_____ ≤ c1”, or ≥ or =). Delete any non-numerical factors:
speed, fuel consumption, range, noise, weight, cost, ease of use, amount of lift,
amount of drag, sonic boom volume, payload (how much it can carry).
Practice Questions 3-5
For each of the following tasks, write an objective function (“maximize ____”) and at
least two constraints (“subject to _____ ≤ c1”, or ≥ or =).
3. A student must create a poster project for a class.
4. A shipping company must deliver packages to customers.
5. A grocery store must decide how to organize the store layout.
Optimization Methods
Types of Optimization Problems
• Some problems have constraints and some do not.
• There can be one variable or many.
• Variables can be discrete (for example, only taking integer values) or continuous.
• Some problems are static (do not change over time) while some are dynamic (continual adjustments must be made as changes occur).
• Systems can be deterministic (specific causes produce specific effects) or stochastic (involving randomness/probability).
• Equations can be linear (graphs are lines) or nonlinear (graphs are curves).
Unconstrained Optimization
Unconstrained optimization problems arise directly in some applications, but they also arise indirectly from reformulations of constrained optimization problems. It is often practical to replace the constraints of an optimization problem with penalty terms in the objective function and to solve the problem as an unconstrained problem.
At a high level, algorithms for unconstrained minimization follow this general structure:
• Choose a starting point x0.
• Beginning at x0, generate a sequence of iterates with non-increasing function values f(xk) until a solution point with sufficient accuracy is found or until no further progress can be made.
A general structure for unconstrained maximization can be formulated in the same way; we simply generate a sequence of iterates with non-decreasing function values.
To generate the next iterate xk+1, the algorithm uses information about the function at xk and possibly at earlier iterates.
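The general structure above can be sketched in code. The update rule is not specified by the text, so the sketch below assumes fixed-step gradient descent purely for illustration; the stopping tests (sufficient accuracy, no further progress) follow the bullets above.

```python
# A minimal sketch of the generic iterative minimization scheme.
# Assumption: fixed-step gradient descent as the update rule.

def minimize_iteratively(f, grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Generate iterates with non-increasing f(x) until accuracy is
    sufficient or no further progress can be made."""
    x = x0
    fx = f(x)
    for _ in range(max_iter):
        x_new = x - step * grad(x)   # uses local information at x
        fx_new = f(x_new)
        if fx_new > fx:              # no further progress can be made
            break
        x, fx = x_new, fx_new
        if abs(grad(x)) < tol:       # sufficient accuracy reached
            break
    return x

# Example: f(x) = (x - 3)^2, which has its minimum at x = 3
x_star = minimize_iteratively(lambda x: (x - 3)**2,
                              lambda x: 2 * (x - 3), x0=0.0)
```

For maximization one would instead accept iterates with non-decreasing function values (or simply minimize −f).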
Extrema points of a function
Let the function f(x) be defined on the interval [a,b].
Local vs. Global Extremes
If a point is a maximum or minimum relative to the other points in its “neighborhood”, then it is considered a local maximum or local minimum.
If a point is a maximum or minimum relative to all the other points on the function, then it is considered a global maximum or global minimum.
[Figures: a curve with a local maximum and local minima; a curve with a global minimum but no global maximum.]
Analogously (just with the opposite inequalities), the local and global maximum points can be defined.
Since max f(x) = −min(−f(x)), we will mostly consider the minimum of a function; the theory for the maximum is similar.
Types of minima
A unimodal function has only one minimum, with the rest of the graph going up from there; or one maximum, with the rest of the graph going down.
With unimodal functions, any extremum you find is guaranteed to be the global extremum.
Unimodal and Multimodal Functions
A bimodal function has two local minima or maxima.
With bimodal functions and beyond, you don’t know whether an extremum is local or global unless you know the entire graph.
Practice Problems
2. Draw a trimodal function with no global maximum. On your function label the local and global minima, and
the local maxima.
4. If a smooth function has n extrema with no global minimum, how many local maxima will it have? How
many local minima will it have? How many total local extremes?
Methods of finding extrema
• Analytical method:
Theorem. If the function f(x) has a derivative on the interval [a,b] and has a local minimum or maximum at a point c in this interval, then its derivative at c equals zero: f′(c) = 0.
Theorem. A continuous function f(x) defined on the interval [a,b] attains its greatest (least) value either at extremum points or at the endpoints of the interval.
Some useful derivatives
Linear functions: f(x) = Ax, f′(x) = A. Example: f(x) = 4x, f′(x) = 4.
Exponents: f(x) = Axⁿ, f′(x) = nAxⁿ⁻¹. Example: f(x) = 3x⁵, f′(x) = 15x⁴.
Logarithms: f(x) = A ln(x), f′(x) = A/x. Example: f(x) = 12 ln x, f′(x) = 12/x.
A necessary condition for a maximum or a minimum is that the derivative equals zero: f′(x*) = 0. At a maximum x*, f′(x) > 0 to the left of x* and f′(x) < 0 to the right; at a minimum, f′(x) < 0 to the left and f′(x) > 0 to the right.
[Figures: f(x) with f′(x*) = 0 at a maximum and at a minimum.]
Example. Suppose revenue is R = 105 + 1.6t − 0.04t². Take the derivative with respect to t and set it equal to zero:
R′ = 1.6 − 0.08t = 0
Then solve for t: t* = 20, which gives Q = 55, P = $2.20, and R = $121.
Suppose that you run a trucking company. You have the following expenses:
• Driver salary: $22.50 per hour
• Depreciation: $0.27 per mile
• Fuel: v/140 dollars per mile, where v is speed in miles per hour
What speed should you tell your driver to maintain in order to minimize your cost per mile?
Note that dividing the driver’s hourly salary by the speed gives the salary per mile, since ($/hr) ÷ (miles/hr) = $/mile. So the cost per mile is
Cost = 22.50/v + 0.27 + v/140
Now take the derivative with respect to v and set it equal to zero:
C′ = −22.50/v² + 1/140 = 0
Then solve for v: v² = 3,150, so v ≈ 56 mph, and Cost ≈ $1.07 per mile.
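The arithmetic of the trucking example can be checked directly: setting C′(v) = −22.50/v² + 1/140 = 0 gives v² = 22.50 × 140 = 3,150.

```python
import math

# Verify the trucking example: C(v) = 22.50/v + 0.27 + v/140 ($/mile).
# C'(v) = 0  =>  v**2 = 22.50 * 140 = 3150.
v_star = math.sqrt(22.50 * 140)            # optimal speed, ~56 mph
cost = 22.50 / v_star + 0.27 + v_star / 140

print(round(v_star), round(cost, 2))       # -> 56 1.07
```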
We want to minimize cost per mile.
[Figure: cost per mile vs. speed (40–72 mph); the curve attains its minimum near 56 mph.]
Remark. In real situations it is often hard to find a derivative, or to solve the equation f′(c) = 0, or even to know an expression for the function, meaning we know only some of its values from observations. Thus we need other methods.
Exhaustive (grid) search algorithm: construct a grid of points xi = a + i·h, i = 0, 1, …, n; find the values of the function at those points, fi = f(xi), and choose the extremum points. Such a method is reliable but not efficient.
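The grid search just described can be sketched in a few lines; the test function and interval are illustrative only.

```python
# Exhaustive (grid) search: evaluate f on a uniform grid over [a, b]
# and take the best point. Reliable but inefficient -- cost grows
# linearly with the number of grid points n.

def grid_search_min(f, a, b, n):
    h = (b - a) / n
    points = [a + i * h for i in range(n + 1)]   # x_i = a + i*h
    return min(points, key=f)

# Example: minimum of f(x) = (x - 1.5)^2 on [0, 4] with n = 400
x_best = grid_search_min(lambda x: (x - 1.5)**2, 0.0, 4.0, 400)
```

The answer is only accurate to within the grid spacing h, which is what makes the method inefficient compared with the interval-reduction methods below.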
Unimodal function
We consider the following minimization problem:
min f(x)
s.t. x ∈ [a, b]
The exact value of the golden ratio is (−1+√5)/2 ≈ 0.618. It has the interesting property that its square equals one minus itself, among others. Expressed algebraically: φ² = 1 − φ, where φ = (−1+√5)/2.
The Problem
Recall that we need three points to verify that a minimum lies within an interval. If you just take the midpoint of the interval, then you still don’t know whether the actual minimum lies left or right of that midpoint.
The Solution
Instead, we divide the interval into three sections instead of two by choosing two interior points instead of one.
Although it would seem obvious to divide the segment into equal thirds, with points at 0.33 and 0.67 across the segment, there is a better way: divide it according to the Golden Ratio, at 0.382… and 0.618… across the segment.
The Next Step
After discarding one outer subinterval, only one new interior point is needed. The cool property of the Golden Ratio is that if we use its exact value, one of the two interior points can be used again – it’s already there.
Golden section method
Assume the objective function f is unimodal over [a, b]. Place the two interior points symmetrically, a fraction ρ from each end:
x1 = a + ρ(b − a)
x2 = b − ρ(b − a)
so that x1 − a = b − x2 = ρ(b − a), and the middle segment has length x2 − x1 = (b − a) − (x1 − a) − (b − x2) = (1 − 2ρ)(b − a).
Requiring that one interior point be reusable at the next iteration gives
ρ = (1 − 2ρ)/(1 − ρ), or ρ² − 3ρ + 1 = 0.
We are interested in the solution which is < 1:
ρ = (3 − √5)/2 ≈ 0.38
Golden section examples in real life
Golden section algorithm
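The algorithm can be sketched as follows, with ρ = (3 − √5)/2 ≈ 0.382 as derived above. The test function and interval are illustrative; f is assumed unimodal on [a, b].

```python
import math

# Golden-section search: shrink [a, b] by the factor (1 - rho) each
# iteration, reusing one interior point and its function value.
RHO = (3 - math.sqrt(5)) / 2          # ~0.382

def golden_section_min(f, a, b, tol=1e-6):
    x1 = a + RHO * (b - a)
    x2 = b - RHO * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                   # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1    # old x1 is reused as new x2
            x1 = a + RHO * (b - a)    # only one new evaluation
            f1 = f(x1)
        else:                         # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2    # old x2 is reused as new x1
            x2 = b - RHO * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: f(x) = (x - 2)^2 + 1 on [0, 5]
x_min = golden_section_min(lambda x: (x - 2)**2 + 1, 0.0, 5.0)
```

Note that after the first iteration each loop pass costs only one function evaluation, which is exactly the reuse property the derivation establishes.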
Example 1
Example 2
Fibonacci method
We continue to discuss the minimization problem where f(x) is unimodal on [a; b].
In the golden section method, two function evaluations are made at the first iteration and then only one function evaluation is made at each subsequent iteration. The ratio by which the interval is reduced remains constant across iterations.
The Fibonacci method differs from the golden section method in that the interval-reduction ratio is not constant. Additionally, the number of subintervals (iterations) is predetermined, based on the specified tolerance.
Fibonacci Search
Example 1
Fibonacci algorithm
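A sketch of the Fibonacci search follows. It mirrors the golden-section code but places the interior points using ratios of Fibonacci numbers, so the reduction ratio changes each iteration; the iteration count n is fixed in advance (after n − 1 reductions the interval has shrunk by the factor 1/F(n+1)). The test function is illustrative.

```python
# Fibonacci search for a unimodal f on [a, b], with n chosen in advance.
def fibonacci_min(f, a, b, n):
    fib = [1, 1]                      # Fibonacci numbers 1, 1, 2, 3, 5, ...
    while len(fib) < n + 2:
        fib.append(fib[-1] + fib[-2])
    x1 = a + (fib[n - 1] / fib[n + 1]) * (b - a)
    x2 = a + (fib[n] / fib[n + 1]) * (b - a)
    f1, f2 = f(x1), f(x2)
    for k in range(n - 1, 0, -1):     # reduction ratio changes with k
        if f1 < f2:                   # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1    # reuse old x1
            x1 = a + (fib[k - 1] / fib[k + 1]) * (b - a)
            f1 = f(x1)
        else:                         # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2    # reuse old x2
            x2 = a + (fib[k] / fib[k + 1]) * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: f(x) = (x - 2)^2 + 1 on [0, 5], with n = 20 iterations
x_min = fibonacci_min(lambda x: (x - 2)**2 + 1, 0.0, 5.0, 20)
```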
Example 2
Approximation methods
Polynomial interpolation
• Bracket the minimum.
• Fit a quadratic or cubic polynomial which
interpolates f(x) at some points in the interval.
• Jump to the (easily obtained) minimum of the
polynomial.
• Throw away the worst point and repeat the
process.
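One step of the scheme above, for the quadratic case, can be sketched as follows: fit a parabola through three bracketing points and jump to its vertex. The vertex formula is the standard three-point parabolic interpolation; the function and points are illustrative.

```python
# One parabolic-interpolation step: the vertex of the parabola through
# (x1, f1), (x2, f2), (x3, f3).
def parabolic_step(f, x1, x2, x3):
    f1, f2, f3 = f(x1), f(x2), f(x3)
    num = (x2 - x1)**2 * (f2 - f3) - (x2 - x3)**2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * num / den       # vertex of the fitted parabola

# For an exactly quadratic f the minimum is found in a single step:
x_star = parabolic_step(lambda x: (x - 2)**2 + 1, 0.0, 1.0, 5.0)  # -> 2.0
```

In the full method this step is repeated: the worst of the three points is discarded, the new point is added, and the process continues until the bracket is small.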
Newton method
The tangent line to f at x0 is y = y0 + f′(x0)(x − x0). For finding a root of f, the iteration formula is:
xn+1 = xn − f(xn)/f′(xn)
Since we are searching for the points where the derivative of the function equals zero, i.e. we are solving the equation f′(x) = 0, the iteration becomes:
xn+1 = xn − f′(xn)/f″(xn)
Newton method
• Global convergence of Newton’s method is poor.
• It often fails if the starting point is too far from the minimum.
• In practice, it must be used with a globalization strategy which reduces the step length until a decrease in the function is assured.
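A bare sketch of the iteration xn+1 = xn − f′(xn)/f″(xn) (without any globalization strategy) looks like this; the example function and starting point are illustrative.

```python
# Newton's method for minimization: root-finding on f'(x) = 0.
def newton_min(df, d2f, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)         # Newton step on the derivative
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = x^4 - 3x^2 + 2 has a local minimum at x = sqrt(3/2)
x_min = newton_min(lambda x: 4*x**3 - 6*x,    # f'
                   lambda x: 12*x**2 - 6,     # f''
                   x0=1.0)
```

Note how this bare version illustrates the caveats above: started near x = 0 it converges to the local maximum at 0 instead (f″ < 0 there), and it fails outright wherever f″(x) = 0.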
Extension to n (multivariate) dimensions
• How big can n be?
– Problem sizes can vary from a handful of parameters to many thousands.
• For n = 2 the cost function surfaces can be visualized.