Week02 Convex Optimization
OPTIMIZATION
FUNDAMENTALS OF OPTIMIZATION
Week 2: Convex Optimization
Outline
1. Definitions
• Convex set
• Convex combination
• Convex function
• Exercises
2. Unconstrained Optimization
3. Constrained Optimization
Convex set
Examples of convex set
Examples of convex set
‖x‖² = xᵀx = ∑ᵢ₌₁ⁿ xᵢ²
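A quick NumPy check of the identity above (the vector is an arbitrary example):

```python
import numpy as np

x = np.array([3.0, 4.0, 12.0])        # an arbitrary example vector

norm_sq = np.linalg.norm(x) ** 2      # ||x||^2
inner   = x @ x                       # x^T x
sum_sq  = np.sum(x ** 2)              # sum of x_i^2

# all three expressions agree (up to floating-point rounding)
print(norm_sq, inner, sum_sq)
```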
Convex combination
Convex function
Exercises
1. Is S = {(x, y) | 0 ≤ x ≤ a, 0 ≤ y ≤ b} a convex set?
2. Is the union of two convex sets a convex set?
3. Is S = {x ∈ ℝⁿ | ‖x − a‖ ≤ r} a convex set?
4. Is S = {(x, y, z) | z ≥ x² + y²} a convex set?
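Before proving these, the claims can be explored numerically. The sketch below samples pairs of points from the ball of exercise 3 and checks that their convex combinations stay inside; passing is evidence, not a proof (the center, radius, and sample count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
a, r = np.array([1.0, 2.0]), 3.0                 # center and radius (arbitrary)

def in_ball(p):
    return np.linalg.norm(p - a) <= r + 1e-12    # small tolerance for rounding

def random_point_in_ball():
    while True:                                  # rejection sampling
        p = a + rng.uniform(-r, r, size=2)
        if in_ball(p):
            return p

ok = True
for _ in range(1000):
    x, y = random_point_in_ball(), random_point_in_ball()
    lam = rng.uniform()
    ok = ok and in_ball(lam * x + (1 - lam) * y)  # convex combination stays in S
print(ok)
```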
Exercises
Is S = {(x, y) | 0 ≤ x ≤ a, 0 ≤ y ≤ b} a convex set?
Convex properties
Exercises
Outline
1. Definitions
2. Unconstrained Optimization
• Introduction to unconstrained optimization
• Descent method
• Newton method
3. Constrained Optimization
Unconstrained convex optimization
Local minimizer
• f(x) = e^x + e^(−x) − 3x² + x has two local minimizers, one of which is the global minimizer.
• f(x) = e^x + e^(−x) − 3x² has two global minimizers.
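As a rough numerical check (not the slides' method), the minimizers of f(x) = e^x + e^(−x) − 3x² can be located by scanning a grid and keeping interior points that are lower than both neighbors:

```python
import numpy as np

f = lambda x: np.exp(x) + np.exp(-x) - 3 * x**2

xs = np.linspace(-5, 5, 100001)       # fine symmetric grid
ys = f(xs)
# interior grid points strictly lower than both neighbors
mask = (ys[1:-1] < ys[:-2]) & (ys[1:-1] < ys[2:])
minima = xs[1:-1][mask]
print(minima)                         # two symmetric minimizers
```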
Local minimizer
Example
• f(x, y) = x² + y² − 2xy + x
• ∇f(x, y) = (2x − 2y + 1, 2y − 2x) = 0 has no solution, so f has no local minimizer
Examples
Example: f(x, y) = e^(x² + y²)
• ∇f(x, y) = (2x·e^(x²+y²), 2y·e^(x²+y²)) = 0 has the unique solution x* = (0, 0)
• ∇²f(x*) = [[2, 0], [0, 2]] ≻ 0 → (0, 0) is a minimizer of f
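The second-order check above can be reproduced numerically: for a symmetric matrix, positive definiteness is equivalent to all eigenvalues being positive. A minimal sketch:

```python
import numpy as np

H = np.array([[2.0, 0.0],
              [0.0, 2.0]])              # Hessian of e^(x^2+y^2) at (0, 0)

eigvals = np.linalg.eigvalsh(H)         # eigenvalues of a symmetric matrix
is_pos_def = bool(np.all(eigvals > 0))  # positive definite iff all > 0
print(eigvals, is_pos_def)
```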
Examples
• ∇f(x) = (−2x + 2y + 1, −2x − 2y) = 0
• ∇²f(x*) = [[−2, 2], [−2, −2]] is not positive definite
• → cannot conclude whether x* is a minimizer
Descent method
Gradient descent method
• The step size αₖ may be chosen so that f(x^(k−1) − αₖ·∇f(x^(k−1))) is minimized: ∂f/∂αₖ = 0 (exact line search)
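As a minimal sketch of the method, here is gradient descent with a fixed step size applied to f(x, y) = x² + y² + xy − x − y (one of the exercises below); the step size α = 0.1, the tolerance, and the iteration cap are assumptions, not values from the slides:

```python
import numpy as np

f  = lambda x: x[0]**2 + x[1]**2 + x[0]*x[1] - x[0] - x[1]
df = lambda x: np.array([2*x[0] + x[1] - 1, x[0] + 2*x[1] - 1])  # gradient of f

def gradient_descent(f, df, x0, alpha=0.1, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = df(x)
        if np.linalg.norm(g) < tol:   # gradient (almost) zero: stationary point
            break
        x = x - alpha * g             # descent step
    return x

x_star = gradient_descent(f, df, [0.0, 0.0])
print(x_star, f(x_star))              # converges to (1/3, 1/3), f* = -1/3
```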
Minimize f(x) = x⁴ − 2x³ − 64x² + 2x + 63
Minimize f(x) = x⁴ + 3x² − 10x + 4
Minimize f(x) = x² + 5·sin(x)
Minimize f(x, y) = x² + y² + xy − x − y
Minimize f(x) = x₁² + x₂² + x₃² − x₁x₂ − x₂x₃ + x₁ + x₃
Newton method
Minimize f(x) = x₁² + x₂² + x₃² − x₁x₂ − x₂x₃ + x₁ + x₃
import numpy as np

def newton(f, df, Hf, x0):
    x = x0
    for i in range(10):
        iH = np.linalg.inv(Hf(x))        # inverse of the Hessian
        D = np.array(df(x))              # gradient as a 1-D vector
        print('df = ', D)
        y = iH.dot(D)                    # Newton step: H^(-1) * gradient
        if np.linalg.norm(y) < 1e-12:    # (near-)zero step: converged
            break
        x = x - y
        print('Step ', i, ': ', x, ' f = ', f(x))
Minimize f(x) = x₁² + x₂² + x₃² − x₁x₂ − x₂x₃ + x₁ + x₃
def main():
    print('main start....')
    # function f to be minimized
    f = lambda x: x[0]**2 + x[1]**2 + x[2]**2 - x[0]*x[1] - x[1]*x[2] + x[0] + x[2]
    # gradient of f
    df = lambda x: [2*x[0] + 1 - x[1], -x[0] + 2*x[1] - x[2], -x[1] + 2*x[2] + 1]
    # Hessian of f (constant, since f is quadratic)
    Hf = lambda x: [[2, -1, 0], [-1, 2, -1], [0, -1, 2]]
    x0 = np.array([0, 0, 0])
    newton(f, df, Hf, x0)

if __name__ == '__main__':
    main()
Outline
1. Definitions
2. Unconstrained Optimization
3. Constrained Optimization
• General constrained optimization problem
General constrained optimization problem
Lagrangian function
L(x, λ) = f(x) + ∑ᵢ₌₁ᵐ λᵢ·gᵢ(x)
Lagrangian function - properties
L(x, λ) = f(x) + ∑ᵢ₌₁ᵐ λᵢ·gᵢ(x)
• Theorem: at an optimal point of the constrained problem, the Lagrangian is stationary: ∇_{x,λ} L(x, λ) = 0
Lagrangian multiplier method
∇f(x₁, …, xₙ) = ∑ᵢ₌₁ᵐ λᵢ·∇gᵢ(x₁, …, xₙ)
g₁(x₁, …, xₙ) = 0
…
gₘ(x₁, …, xₙ) = 0
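For Example 2 below (maximize xy subject to 10x + 20y = 400), the stationarity conditions and the constraint are all linear, so the whole Lagrangian system can be solved directly; a minimal NumPy sketch (the equations y − 10μ = 0 and x − 20μ = 0 are ∂L/∂x and ∂L/∂y for L = xy − μ(10x + 20y − 400)):

```python
import numpy as np

# unknowns ordered as (x, y, mu)
A = np.array([[0.0, 1.0, -10.0],    # dL/dx:  y - 10*mu = 0
              [1.0, 0.0, -20.0],    # dL/dy:  x - 20*mu = 0
              [10.0, 20.0, 0.0]])   # constraint: 10x + 20y = 400
b = np.array([0.0, 0.0, 400.0])

x, y, mu = np.linalg.solve(A, b)
print(x, y, mu)                     # x = 20, y = 10, mu = 1
```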
Example 1:
• Notice that we can't have λ = 0, since that would not satisfy the first two equations. So, knowing that λ ≠ 0, we can solve the first two equations for x and y respectively. This gives,
Example 1
• Now that we know λ, we can find the points that are potential maximums and/or minimums.
• If λ = −1/4, we get x = −10, y = 6.
• If λ = 1/4, we get x = 10, y = −6.
• So,
Example 2:
• Problem:
• Objective function: maximize f(x, y) = xy
• Constraint: g(x, y) = 10x + 20y − 400 = 0
• Solution
• Form the Lagrange function:
L(x, y, μ) = f(x, y) − μ·g(x, y)
L(x, y, μ) = xy − μ(10x + 20y − 400)
• Set each first-order partial derivative equal to zero:
∂L/∂x = y − 10μ = 0
∂L/∂y = x − 20μ = 0
∂L/∂μ = −(10x + 20y − 400) = 0
Example 2:
• Problem:
• Objective function: maximize f(x, y) = xy
• Constraint: g(x, y) = 10x + 20y − 400 = 0
• Solution:
• Set each first-order partial derivative equal to zero:
• So, μ = 1, x = 20, y = 10, and the constrained maximum is f(20, 10) = 200.
Example 3:
• Problem:
• Objective function: maximize f(x, y) = x + y
• Constraint: g(x, y) = x² + y² − 2 = 0
• Solution:
• Form the Lagrange function:
L(x, y, λ) = x + y + λ(x² + y² − 2)
Example 3
• Problem:
• Objective function: maximize f(x, y) = x + y
• Constraint: g(x, y) = x² + y² − 2 = 0
• Solution:
• Set each first-order partial derivative equal to zero:
• ∂L/∂x = 1 + 2λx = 0
• ∂L/∂y = 1 + 2λy = 0
• ∂L/∂λ = x² + y² − 2 = 0
• We have (x, y) ∈ {(1, 1), (−1, −1)}
• So, (x, y) = (1, 1), since f(1, 1) = 2 > f(−1, −1) = −2
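The answer can be sanity-checked numerically (this check is not part of the slides) by parameterizing the constraint circle x² + y² = 2 as (√2·cos θ, √2·sin θ) and maximizing x + y over a grid of θ:

```python
import numpy as np

theta = np.linspace(0, 2 * np.pi, 100000)
x = np.sqrt(2) * np.cos(theta)        # points on the constraint circle
y = np.sqrt(2) * np.sin(theta)

i = np.argmax(x + y)                  # best point on the grid
print(x[i], y[i])                     # close to (1, 1), where f = 2
```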
THANK YOU !