Assignment
Exercise 1 (40/100)
Consider the highly non-linear Rosenbrock’s function:
f(x, y) := (1 − x)^2 + 100 (y − x^2)^2    (1)
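Both methods in Exercise 1 (and BFGS in Exercise 2) need the gradient of (1), and Newton's method additionally needs its Hessian; both follow by direct differentiation:

```latex
\nabla f(x, y) =
\begin{pmatrix}
-2(1 - x) - 400\,x\,(y - x^2) \\
200\,(y - x^2)
\end{pmatrix},
\qquad
\nabla^2 f(x, y) =
\begin{pmatrix}
2 - 400(y - x^2) + 800x^2 & -400x \\
-400x & 200
\end{pmatrix}.
```

At the unique minimizer (1, 1) the gradient vanishes and the Hessian is positive definite.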
1. Implement in MATLAB two functions:
• Newton’s method (Newton.m)
• the Steepest descent (Gradient) method (GD.m)
where both methods can be run either with the backtracking algorithm (backtracking.m; for reference,
see J. Nocedal and S. Wright, Numerical Optimization, page 37) or with the fixed step size β = 1. You
can reuse your previous implementation of the Steepest descent method and simply extend it so that
it can use the backtracking algorithm to compute the step size β. Use the following values for the
backtracking parameters: α̃ = 1, ρ = 0.9. You can choose the parameter c1 anywhere in the
interval [10^−4, 0.5].
2. Minimize Rosenbrock’s function (1) by using the Steepest Descent (Gradient) method, both with
backtracking and with the fixed step size β = 1. Use the starting value (0, 0), a maximum number of
iterations N = 50000, and tolerance TOL = 10^−6.
3. Minimize Rosenbrock’s function (1) by using Newton’s method, both with backtracking and with the
fixed step size β = 1. Use the starting value (0, 0), a maximum number of iterations N = 50000, and
tolerance TOL = 10^−6.
4. Plot the obtained iterates on the energy landscape in 2D.
5. Analyze the convergence behaviour of the methods by plotting the gradient norm and the function
value at each iteration.
6. Compare and comment on the performance of the different methods.
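The assignment asks for MATLAB files; as a language-neutral illustration, the Armijo backtracking loop of item 1 (Nocedal & Wright, Algorithm 3.1) can be sketched in Python as follows. The function and parameter names are placeholders, not the required MATLAB interface:

```python
import numpy as np

def backtracking(f, grad, x, p, alpha0=1.0, rho=0.9, c1=1e-4):
    """Shrink alpha until the sufficient-decrease (Armijo) condition
    f(x + alpha*p) <= f(x) + c1*alpha*grad(x)'*p holds.
    Illustrative sketch; names are not the required MATLAB interface."""
    alpha = alpha0
    fx = f(x)
    slope = grad(x) @ p      # directional derivative; negative for a descent direction
    while f(x + alpha * p) > fx + c1 * alpha * slope:
        alpha *= rho         # contract the trial step by the factor rho = 0.9
    return alpha
```

For a descent direction p (i.e. grad(x)·p < 0) the loop terminates, since the Armijo condition holds for all sufficiently small alpha.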
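For item 2, the steepest-descent loop that GD.m should implement can be sketched in Python as below; the helper names are illustrative, and in the actual run you would use N = 50000 and TOL = 10^−6 as specified:

```python
import numpy as np

def rosenbrock(v):
    x, y = v
    return (1 - x)**2 + 100*(y - x**2)**2

def rosenbrock_grad(v):
    x, y = v
    return np.array([-2*(1 - x) - 400*x*(y - x**2), 200*(y - x**2)])

def gd(x0, tol=1e-6, max_iter=50000, rho=0.9, c1=1e-4):
    """Steepest descent with Armijo backtracking; records the iterates
    so they can later be plotted on the energy landscape (item 4)."""
    x = np.asarray(x0, dtype=float)
    history = [x.copy()]
    for _ in range(max_iter):
        g = rosenbrock_grad(x)
        if np.linalg.norm(g) < tol:          # stop when the gradient is small
            break
        p = -g                               # steepest-descent direction
        fx, alpha = rosenbrock(x), 1.0
        while rosenbrock(x + alpha * p) > fx + c1 * alpha * (g @ p) and alpha > 1e-12:
            alpha *= rho                     # backtrack until sufficient decrease
        x = x + alpha * p
        history.append(x.copy())
    return x, history
```

Expect slow progress: in the flat curved valley of Rosenbrock's function the steepest-descent direction zig-zags, which is exactly the behaviour items 5 and 6 ask you to observe.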
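For item 3, Newton's method solves the linear system ∇²f(x) p = −∇f(x) for the search direction and then backtracks along p. A Python sketch of the logic Newton.m should follow, with illustrative names and the same caveats as above:

```python
import numpy as np

def rosen(v):
    x, y = v
    return (1 - x)**2 + 100*(y - x**2)**2

def rosen_grad(v):
    x, y = v
    return np.array([-2*(1 - x) - 400*x*(y - x**2), 200*(y - x**2)])

def rosen_hess(v):
    x, y = v
    return np.array([[2 - 400*(y - x**2) + 800*x**2, -400*x],
                     [-400*x, 200.0]])

def newton(x0, tol=1e-6, max_iter=50000, rho=0.9, c1=1e-4):
    """Newton's method with Armijo backtracking on the Newton direction.
    Illustrative sketch, not the required MATLAB interface."""
    x = np.asarray(x0, dtype=float)
    k = 0
    for k in range(max_iter):
        g = rosen_grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(rosen_hess(x), -g)   # Newton direction: H p = -g
        alpha = 1.0
        while rosen(x + alpha * p) > rosen(x) + c1 * alpha * (g @ p) and alpha > 1e-12:
            alpha *= rho                         # Armijo backtracking
        x = x + alpha * p
    return x, k
```

Near the minimizer the unit step α = 1 is accepted and the iteration converges quadratically, so far fewer than N = 50000 iterations should be needed.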
Exercise 2 (40/100)
Consider again the highly non-linear Rosenbrock’s function:
f(x, y) := (1 − x)^2 + 100 (y − x^2)^2    (2)
1. Implement the BFGS method (BFGS.m) with backtracking for the step size β. For reference, see J.
Nocedal and S. Wright, Numerical Optimization, page 141.
2. Test your implementation by minimizing Rosenbrock’s function. Use the starting value x0 = (0, 0),
initial inverse Hessian approximation H0 = I, a maximum number of iterations N = 500, and tolerance
TOL = 10^−6.
3. Plot the obtained iterates on the energy landscape in 2D.
4. Analyze the convergence behaviour of the method by plotting the gradient norm and the function
value at each iteration.
5. Produce a table comparing the number of iterations required by BFGS, by Newton’s method (with
backtracking), and by the Steepest descent method (with backtracking). You can use the results from
the previous exercise. Comment on the results by comparing the different methods.
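As an illustration of what BFGS.m should do, here is a hedged Python sketch of Algorithm 6.1 from Nocedal & Wright, using the inverse-Hessian update H ← (I − ρ s yᵀ) H (I − ρ y sᵀ) + ρ s sᵀ with ρ = 1/(yᵀs), together with Armijo backtracking. All names are illustrative, not the required MATLAB interface:

```python
import numpy as np

def rosen(v):
    x, y = v
    return (1 - x)**2 + 100*(y - x**2)**2

def rosen_grad(v):
    x, y = v
    return np.array([-2*(1 - x) - 400*x*(y - x**2), 200*(y - x**2)])

def bfgs(x0, tol=1e-6, max_iter=500, rho=0.9, c1=1e-4):
    """BFGS: maintain an approximation H of the inverse Hessian,
    updated from the displacement s and gradient change y.
    Illustrative sketch of Nocedal & Wright, Algorithm 6.1."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                            # H0 = I, as required
    g = rosen_grad(x)
    k = 0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                           # quasi-Newton search direction
        alpha = 1.0
        while rosen(x + alpha * p) > rosen(x) + c1 * alpha * (g @ p) and alpha > 1e-12:
            alpha *= rho                     # Armijo backtracking
        x_new = x + alpha * p
        g_new = rosen_grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                       # update only if the curvature condition holds
            rho_k = 1.0 / sy
            V = np.eye(n) - rho_k * np.outer(s, y)
            H = V @ H @ V.T + rho_k * np.outer(s, s)
        x, g = x_new, g_new
    return x, k
```

With plain Armijo backtracking (rather than a Wolfe line search) the curvature condition yᵀs > 0 is not guaranteed, hence the skip in the update; for the table in item 5, count the outer iterations k.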