
CHAPTER FOUR

DYNAMIC PROGRAMMING
OUTLINE
• General Method
• Multistage graph problem
• All pairs shortest path problem
• 0/1 knapsack problem
• Travelling salesperson problem
• Disconnected components
Dynamic programming
Dynamic Programming is an algorithm design technique for optimization
problems: often minimizing or maximizing.
Like divide and conquer, DP solves problems by combining solutions to
sub-problems.
Unlike divide and conquer, the sub-problems are not independent:
 Sub-problems may share sub-sub-problems.
 However, the solution to one sub-problem does not affect the solutions to
other sub-problems of the same problem.
DP reduces computation by
 Solving sub-problems in a bottom-up fashion.
 Storing the solution to a sub-problem the first time it is solved.
 Looking up the stored solution when the sub-problem is encountered again.

E.g. of Dynamic Programming
• Fibonacci sequence: 0, 1, 1, 2, 3, 5, 8, 13, 21, …
Fi = i if i ≤ 1
Fi = Fi-1 + Fi-2 if i ≥ 2
• Solved by a recursive program:
F(n) = F(n-1) + F(n-2)
[Recursion tree for f5: f5 calls f4 and f3; f4 calls f3 and f2; f3 calls f2 and f1;
and so on down to f1 and f0. The same sub-problems f3, f2, f1, f0 appear many times.]
• Much replicated computation is done.
• It should be solved by a simple loop.
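
A minimal Python sketch (added for illustration, not from the slides) of the top-down
alternative: the same recursion, but each sub-problem's solution is stored the first
time it is solved and looked up when it is encountered again.

from functools import lru_cache

@lru_cache(maxsize=None)            # store each F(i) the first time it is computed
def fib(i):
    if i <= 1:                      # Fi = i if i <= 1
        return i
    return fib(i - 1) + fib(i - 2)  # Fi = Fi-1 + Fi-2 if i >= 2

print(fib(5))   # 5, with every sub-problem solved only once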
E.g. Dynamic Programming
Computing the nth Fibonacci number using bottom-up iteration
and recording results:

F(0) = 0
F(1) = 1
F(2) = 1 + 0 = 1
. . .
F(n-2) = F(n-3) + F(n-4)
F(n-1) = F(n-2) + F(n-3)
F(n) = F(n-1) + F(n-2)

Recorded results: 0, 1, 1, . . . , F(n-2), F(n-1), F(n)


Efficiency:
- time O(n)
- space O(n)
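
A short Python sketch (added for illustration) of this bottom-up computation,
recording results in a table so that each value is computed exactly once:

def fib_bottom_up(n):
    # Table F[0..n] of recorded results, filled in a bottom-up fashion.
    F = [0] * (n + 1)
    if n >= 1:
        F[1] = 1
    for i in range(2, n + 1):
        F[i] = F[i - 1] + F[i - 2]   # F(i) = F(i-1) + F(i-2)
    return F[n]

print(fib_bottom_up(10))   # 55; O(n) time and O(n) space, as noted above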
Steps in Dynamic Programming
1. Characterize the structure of an optimal solution.
2. Define the value of an optimal solution recursively.
3. Compute optimal solution values either top-down with caching or
bottom-up in a table.
4. Construct an optimal solution from the computed values.
Dynamic Programming
Principle of optimality: Suppose that in solving a problem, we have to make a
sequence of decisions D1, D2, …, Dn. If this sequence is optimal, then the last
k decisions, 1 ≤ k ≤ n, must also be optimal.
E.g. the shortest path problem: if i, i1, i2, …, j is a shortest path from i to j,
then i1, i2, …, j must be a shortest path from i1 to j.
Dynamic Programming
The shortest path
• To find a shortest path in a multi-stage graph
[Figure: a small multi-stage graph from S to T through intermediate vertices A and B,
with edge weights 1, 2, 3, 4, 5, 5, 6, 7; figure not reproduced here.]
• Apply the greedy method :
the shortest path from S to T :
1+2+5=8
Multistage graph problem
• A multistage graph G = (V, E) is a directed graph whose vertices are
partitioned into k (where k > 1) disjoint subsets S = {s1, s2, …, sk} such
that if edge (u, v) is in E, then u Є si and v Є si+1 for some i, and
|s1| = |sk| = 1.
• The vertex s Є s1 is called the source and the vertex t Є sk is
called the sink.
• G is usually assumed to be a weighted graph. In this graph, the cost of an
edge (i, j) is represented by c(i, j). Hence, the cost of a path from source s
to sink t is the sum of the costs of the edges on this path.
• The multistage graph problem is to find the path with minimum cost from
source s to sink t.
Multistage graph problem
 There are two approaches in DP:
1. Forward approach and
2. Backward approach.
 Note that if the recurrence relations are formulated using the
forward approach, then the relations are solved backwards,
i.e., beginning with the last decision.
 On the other hand, if the relations are formulated using the
backward approach, they are solved forwards.
 To solve a problem by using dynamic programming:
 Find out the recurrence relations.
 Represent the problem by a multistage graph.
Example of multistage graph problem
[Figure: a 5-stage graph with source A; stage 2 vertices B, C, D, E; stage 3
vertices F, G, H; stage 4 vertices I, J, K; and sink L. The edge costs c(i, j)
used below are taken from this figure.]
Forward approach
cost(4,I) = c(I,L) = 7
cost(4,J) = c(J,L) = 8
cost(4,K) = c(K,L) = 11

cost(3,F) = min { c(F,I) + cost(4,I) | c(F,J) + cost(4,J) }


cost(3,F) = min { 12 + 7 | 9 + 8 } = 17
cost(3,G) = min { c(G,I) + cost(4,I) | c(G,J) + cost(4,J) }
cost(3,G) = min { 5 + 7 | 7 + 8 } = 12
cost(3,H) = min { c(H,J) + cost(4,J) | c(H,K) + cost(4,K) }
cost(3,H) = min { 10 + 8 | 8 + 11 } = 18
cost(2,B) = min { c(B,F) + cost(3,F) | c(B,G) + cost(3,G) | c(B,H) + cost(3,H) }
cost(2,B) = min { 4 + 17 | 8 + 12 | 11 + 18 } = 20
cost(2,C) = min { c(C,F) + cost(3,F) | c(C,G) + cost(3,G) }
cost(2,C) = min { 10 + 17 | 3 + 12 } = 15
cost(2,D) = min { c(D,H) + cost(3,H) }
cost(2,D) = min { 9 + 18 } = 27
cost(2,E) = min { c(E,G) + cost(3,G) | c(E,H) + cost(3,H) }
cost(2,E) = min { 6 + 12 | 12 + 18 } = 18
cost(1,A) = min { c(A,B) + cost(2,B) | c(A,C) + cost(2,C) | c(A,D) + cost(2,D) |
c(A,E) + cost(2,E) }
cost(1,A) = min { 7 + 20 | 6 + 15 | 5 + 27 | 9 + 18 } = 21
The minimum-cost path is A - C - G - I - L, with total cost 21.
Backward approach
cost(2,B) = c(A,B) = 7
cost(2,C) = c(A,C) = 6
cost(2,D) = c(A,D) = 5
cost(2,E) = c(A,E) = 9.
cost(3,F) = min { c(B,F) + cost(2,B) | c(C,F) + cost(2,C) }
cost(3,F) = min { 4 + 7 | 10 + 6 } = 11
cost(3,G) = min { c(B,G) + cost(2,B) | c(C,G) + cost(2,C) | c(E,G)
+ cost(2,E) }
cost(3,G) = min { 8 + 7 | 3 + 6 | 6 + 9 } = 9
cost(3,H) = min { c(B,H) + cost(2,B) | c(D,H) + cost(2,D) | c(E,H)
+ cost(2,E) }
cost(3,H) = min { 11 + 7 | 9 + 5 | 12 + 9 } = 14
cost(4,I) = min { c(F,I) + cost(3,F) | c(G,I) + cost(3,G) }
cost(4,I) = min { 12 + 11 | 5 + 9 } = 14
cost(4,J) = min { c(F,J) + cost(3,F) | c(G,J) + cost(3,G) | c(H,J)
+ cost(3,H) }
cost(4,J) = min { 9 + 11 | 7 + 9 | 10 + 14 } = 16
cost(4,K) = min { c(H,K) + cost(3,H) }
cost(4,K) = min { 8 + 14 } = 22
cost(5,L) = min { c(I,L) + cost(4,I) | c(J,L) + cost(4,J) | c(K,L)
+ cost(4,K) }
cost(5,L) = min { 7 + 14 | 8 + 16 | 11 + 22 } = 21
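
Both approaches give the same minimum cost, 21, along the path A - C - G - I - L.
A Python sketch (added for illustration) of the forward-approach recurrence
cost(v) = min over edges (v, u) of c(v, u) + cost(u), using the edge costs from
the example above; the stage-by-stage processing order is written out explicitly.

# Edge costs c(u, v) taken from the worked example above.
c = {
    'A': {'B': 7, 'C': 6, 'D': 5, 'E': 9},
    'B': {'F': 4, 'G': 8, 'H': 11},
    'C': {'F': 10, 'G': 3},
    'D': {'H': 9},
    'E': {'G': 6, 'H': 12},
    'F': {'I': 12, 'J': 9},
    'G': {'I': 5, 'J': 7},
    'H': {'J': 10, 'K': 8},
    'I': {'L': 7},
    'J': {'L': 8},
    'K': {'L': 11},
    'L': {},
}

def multistage_forward(c, source, sink):
    # Process vertices from the last stage back to the source (forward approach:
    # the relations are solved backwards, beginning with the last decision).
    order = ['L', 'I', 'J', 'K', 'F', 'G', 'H', 'B', 'C', 'D', 'E', 'A']
    cost, nxt = {sink: 0}, {}
    for v in order:
        if v == sink:
            continue
        # cost(v) = min over edges (v, u) of c(v, u) + cost(u)
        u = min(c[v], key=lambda u: c[v][u] + cost[u])
        cost[v] = c[v][u] + cost[u]
        nxt[v] = u
    # Recover the minimum-cost path from source to sink.
    path, v = [source], source
    while v != sink:
        v = nxt[v]
        path.append(v)
    return cost[source], path

print(multistage_forward(c, 'A', 'L'))   # (21, ['A', 'C', 'G', 'I', 'L'])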
All pairs shortest path problem
• In the all pairs shortest path problem, we want to find the shortest path from
every possible source to every possible destination. Specifically, for every pair
of vertices u and v, we need to compute the following information:
• dist(u, v) is the length of the shortest path (if any) from u to v;
• pred(u, v) is the second-to-last vertex (if any) on the shortest path (if any)
from u to v.
• For example, for any vertex v, we have dist(v, v) = 0 and pred(v, v) = Null.
If the shortest path from u to v is only one edge long, then dist(u, v) = w(uv)
and pred(u, v) = u. If there is no shortest path from u to v (either because
there is no path at all, or because there is a negative cycle), then
dist(u, v) = ∞ and pred(u, v) = Null.
Algorithm
• Find the distance between every pair of vertices in a weighted directed graph G.
• We can make n calls to Dijkstra's algorithm (if there are no negative edges),
which takes O(n m log n) time.
• Likewise, n calls to Bellman-Ford would take O(n²m) time.
• We can achieve O(n³) time with the following dynamic programming algorithm:

Algorithm AllPair(G)            {assumes vertices 1, …, n}
    for all vertex pairs (i, j)
        if i = j
            D0[i, i] ← 0
        else if (i, j) is an edge in G
            D0[i, j] ← weight of edge (i, j)
        else
            D0[i, j] ← +∞
    for k ← 1 to n do
        for i ← 1 to n do
            for j ← 1 to n do
                Dk[i, j] ← min{ Dk-1[i, j], Dk-1[i, k] + Dk-1[k, j] }
    return Dn
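
A compact Python sketch (added for illustration) of the same idea, better known as
the Floyd-Warshall algorithm; it updates a single distance matrix in place instead
of keeping every Dk. The example weight matrix is chosen only for illustration.

INF = float('inf')

def all_pairs_shortest_paths(w):
    # w[i][j] = edge weight, or INF if there is no edge; w[i][i] = 0.
    n = len(w)
    D = [row[:] for row in w]          # D0: direct edge weights
    for k in range(n):                 # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D                           # D[i][j] = dist(i, j)

w = [[0, 3, INF, 7],
     [8, 0, 2, INF],
     [5, INF, 0, 1],
     [2, INF, INF, 0]]
for row in all_pairs_shortest_paths(w):
    print(row)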


0-1 Knapsack Problem
• Given a knapsack with maximum capacity W, and a
set S consisting of n items
• Each item i has some weight wi and benefit value bi
(all wi and W are integer values)
• Problem: How to pack the knapsack to achieve
maximum total value of packed items?
• The problem, in other words, is to find a subset T ⊆ S that achieves
max Σi∈T bi subject to Σi∈T wi ≤ W,
where bi = benefit of item i, wi = weight of item i, and W = the maximum capacity.
0-1 Knapsack Algorithm
for w = 0 to W
    V[0,w] = 0
for i = 1 to n
    V[i,0] = 0
for i = 1 to n
    for w = 1 to W
        if wi <= w                          // item i can be part of the solution
            if bi + V[i-1,w-wi] > V[i-1,w]
                V[i,w] = bi + V[i-1,w-wi]
            else
                V[i,w] = V[i-1,w]
        else V[i,w] = V[i-1,w]              // wi > w
Running time
for w = 0 to W
    V[0,w] = 0                          O(W)
for i = 1 to n
    V[i,0] = 0                          O(n)
for i = 1 to n                          repeated n times
    for w = 0 to W                      O(W)
        …
What is the running time of this algorithm?
O(n*W)
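
A runnable Python sketch (added for illustration) of the same table-filling
algorithm; the names follow the pseudocode above (V, wi, bi, W).

def knapsack_01(weights, benefits, W):
    # V[i][w] = maximum benefit using the first i items with capacity w.
    n = len(weights)
    V = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        wi, bi = weights[i - 1], benefits[i - 1]
        for w in range(1, W + 1):
            if wi <= w:                                    # item i can fit
                V[i][w] = max(V[i - 1][w], bi + V[i - 1][w - wi])
            else:                                          # wi > w
                V[i][w] = V[i - 1][w]
    return V[n][W]

# The example used on the next slides: items (2,3), (3,4), (4,5), (5,6), W = 5.
print(knapsack_01([2, 3, 4, 5], [3, 4, 5, 6], 5))   # 7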
Example
Let’s run our algorithm on the following data:
n = 4 (# of elements) W = 5 (max weight)
Elements (weight, benefit):(2,3), (3,4), (4,5), (5,6)
CONT…
Items: 1: (2,3), 2: (3,4), 3: (4,5), 4: (5,6); W = 5.
V[i,w] is a table with rows i = 0..4 and columns w = 0..5; row 0 and column 0
are initialized to 0. At every step the recurrence of the algorithm is applied:
if wi <= w                              // item i can be part of the solution
    if bi + V[i-1,w-wi] > V[i-1,w]
        V[i,w] = bi + V[i-1,w-wi]
    else
        V[i,w] = V[i-1,w]
else V[i,w] = V[i-1,w]                  // wi > w

Row i = 1 (bi = 3, wi = 2):
w = 1: w - wi = -1, item 1 does not fit, V[1,1] = V[0,1] = 0
w = 2: w - wi = 0, V[1,2] = max{ V[0,2], 3 + V[0,0] } = 3
w = 3: w - wi = 1, V[1,3] = max{ V[0,3], 3 + V[0,1] } = 3
w = 4: w - wi = 2, V[1,4] = max{ V[0,4], 3 + V[0,2] } = 3
w = 5: w - wi = 3, V[1,5] = max{ V[0,5], 3 + V[0,3] } = 3

Row i = 2 (bi = 4, wi = 3):
w = 1, 2: item 2 does not fit, V[2,1] = 0, V[2,2] = 3
w = 3: w - wi = 0, V[2,3] = max{ V[1,3], 4 + V[1,0] } = 4
w = 4: w - wi = 1, V[2,4] = max{ V[1,4], 4 + V[1,1] } = 4
w = 5: w - wi = 2, V[2,5] = max{ V[1,5], 4 + V[1,2] } = 7

Row i = 3 (bi = 5, wi = 4):
w = 1..3: item 3 does not fit, the row is copied from the row above
w = 4: w - wi = 0, V[3,4] = max{ V[2,4], 5 + V[2,0] } = 5
w = 5: w - wi = 1, V[3,5] = max{ V[2,5], 5 + V[2,1] } = 7

Row i = 4 (bi = 6, wi = 5):
w = 1..4: item 4 does not fit, the row is copied from the row above
w = 5: w - wi = 0, V[4,5] = max{ V[3,5], 6 + V[3,0] } = 7

Completed table:
i\w   0   1   2   3   4   5
0     0   0   0   0   0   0
1     0   0   3   3   3   3
2     0   0   3   4   4   7
3     0   0   3   4   5   7
4     0   0   3   4   5   7

The maximum total benefit is V[4,5] = 7 (items 1 and 2, total weight 5).
Travelling Salesperson Problem
• A traveling salesman wishes to go to a certain number of destinations
in order to sell objects.
• He wants to travel to each destination exactly once and return home
taking the shortest total route.
• Let us consider a graph G = (V, E), where V is a set of cities and E is a
set of weighted edges.
• An edge e(u, v) represents that vertices u and v are connected.
• The distance between vertices u and v is d(u, v), which should be non-negative.
CONT…
• For a subset of cities S containing 1 and a city j Є S, let C(S, j) be the cost
of the minimum-cost path that starts at 1, visits every vertex in S exactly once,
and ends at j.
• When |S| > 1, we define C(S, 1) = ∞, since the path cannot both start and end
at 1.
• Now, let us express C(S, j) in terms of smaller sub-problems. We need to start
at 1 and end at j. We should select the next-to-last city i in such a way that
C(S, j) = min { C(S − {j}, i) + d(i, j) : i Є S and i ≠ j }
Algorithm
C({1}, 1) = 0
for s = 2 to n do
    for all subsets S Є {1, 2, 3, …, n} of size s and containing 1
        C(S, 1) = ∞
        for all j Є S and j ≠ 1
            C(S, j) = min { C(S − {j}, i) + d(i, j) for i Є S and i ≠ j }
return min over j of C({1, 2, 3, …, n}, j) + d(j, 1)
Example
[Figure: a complete graph on the four cities 1, 2, 3, 4; the distances d(i, j)
used below are taken from it.]
• S = Φ
Cost(2,Φ,1) = d(2,1) = 5
Cost(3,Φ,1) = d(3,1) = 6
Cost(4,Φ,1) = d(4,1) = 8
Cont…
• |S| = 1
• Cost(i,S,1) = min { d[i,j] + Cost(j, S − {j}, 1) : j Є S }
• Cost(2,{3},1) = d[2,3] + Cost(3,Φ,1) = 9 + 6 = 15
• Cost(2,{4},1) = d[2,4] + Cost(4,Φ,1) = 10 + 8 = 18
• Cost(3,{2},1) = d[3,2] + Cost(2,Φ,1) = 13 + 5 = 18
• Cost(3,{4},1) = d[3,4] + Cost(4,Φ,1) = 12 + 8 = 20
• Cost(4,{3},1) = d[4,3] + Cost(3,Φ,1) = 9 + 6 = 15
• Cost(4,{2},1) = d[4,2] + Cost(2,Φ,1) = 8 + 5 = 13
Cont…
• |S| = 2
• Cost(2,{3,4},1) = min { d[2,3] + Cost(3,{4},1) = 9 + 20 = 29,
                          d[2,4] + Cost(4,{3},1) = 10 + 15 = 25 } = 25
• Cost(3,{2,4},1) = min { d[3,2] + Cost(2,{4},1) = 13 + 18 = 31,
                          d[3,4] + Cost(4,{2},1) = 12 + 13 = 25 } = 25
• Cost(4,{2,3},1) = min { d[4,2] + Cost(2,{3},1) = 8 + 15 = 23,
                          d[4,3] + Cost(3,{2},1) = 9 + 18 = 27 } = 23
Cont…
• |S| = 3
• Cost(1,{2,3,4},1) = min { d[1,2] + Cost(2,{3,4},1) = 10 + 25 = 35,
                            d[1,3] + Cost(3,{2,4},1) = 15 + 25 = 40,
                            d[1,4] + Cost(4,{2,3},1) = 20 + 23 = 43 } = 35
Finally, the minimum cost of the tour is 35,
and the minimum-cost path in the graph is 1 - 2 - 4 - 3 - 1.
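
A Python sketch (added for illustration) of the Held-Karp recurrence given above;
the distance matrix is the one from this worked example, with cities 1..4
renumbered 0..3.

from itertools import combinations

def tsp_held_karp(d):
    # C[(S, j)] = cost of the cheapest path that starts at city 0, visits every
    # city in frozenset S exactly once, and ends at j (0 not in S, j in S).
    n = len(d)
    C = {}
    for j in range(1, n):
        C[(frozenset([j]), j)] = d[0][j]
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in S:
                C[(S, j)] = min(C[(S - {j}, i)] + d[i][j] for i in S if i != j)
    full = frozenset(range(1, n))
    # Close the tour by returning from the last city j back to city 0.
    return min(C[(full, j)] + d[j][0] for j in full)

# Distance matrix d[i][j] from the worked example (city k here = city k+1 above).
d = [[0, 10, 15, 20],
     [5,  0,  9, 10],
     [6, 13,  0, 12],
     [8,  8,  9,  0]]
print(tsp_held_karp(d))   # 35, matching the minimum tour cost found above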
Hamiltonian path and cycle
• Let G = (V, E) be a connected graph with n vertices. A path in G that contains
every vertex of G exactly once is called a Hamiltonian path of G.
• A Hamiltonian cycle is a round-trip path along n edges of G that visits every
vertex exactly once and returns to the starting position. In other words, if a
Hamiltonian cycle begins at some vertex v1 Є V and the vertices of G are visited
in the order v1, v2, …, vn+1, then the edges (vi, vi+1) are in E for 1 ≤ i ≤ n,
and the vi are distinct except for v1 and vn+1, which are equal.
• Simply put, a Hamiltonian cycle is a tour which contains every vertex exactly once.
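
A small Python sketch (added for illustration) that checks this definition: whether
a given vertex order v1, …, vn is a Hamiltonian cycle of a graph given as adjacency
sets. The function name and sample graph are illustrative, not from the slides.

def is_hamiltonian_cycle(adj, order):
    # 'order' lists v1, ..., vn; the cycle implicitly returns to order[0] (= vn+1).
    n = len(adj)
    if len(order) != n or len(set(order)) != n:
        return False                        # must contain every vertex exactly once
    return all(order[(i + 1) % n] in adj[order[i]] for i in range(n))

# Example: the 4-cycle a-b-c-d-a.
adj = {'a': {'b', 'd'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'c', 'a'}}
print(is_hamiltonian_cycle(adj, ['a', 'b', 'c', 'd']))   # True
print(is_hamiltonian_cycle(adj, ['a', 'c', 'b', 'd']))   # False (a-c is not an edge)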


Backtracking
• Backtracking is a technique used to solve problems with a large search
space, by systematically trying and eliminating possibilities.
• It is a strategy for guessing at a solution and backing up when an impasse
is reached.
• If the current step cannot lead to a solution, the step is backtracked, the
next possible choice is applied to the previous steps, and then we proceed
further. In fact, one of the key ideas in backtracking is recursion.
• A backtracking algorithm ends when there are no more solutions to the
first sub-problem.
• The most common problem that can be solved by using backtracking is the
n-Queens problem.
n-Queens problem
• General technique
In a working solution, exactly one queen must appear in each row and in each
column, and at most one queen in each diagonal.
Simply put: find a configuration of n queens not attacking each other.
Algorithm:
- Start with one queen at the first column, first row.
- Continue with the second queen from a proper place, i.e. it cannot be in the
same row, column or diagonal as an earlier queen.
- Go up until a permissible position is found.
- Continue with the next queen.
- If you reach an impasse, backtrack to the previous column.
Eight-Queens Problem
• Place eight queens on the chessboard so that no queen can
attack any other queen
• A recursive algorithm that places a queen in a column
• Base case
  • If there are no more columns to consider, you are finished.
• Recursive step
  • If you successfully place a queen in the current column, consider the next column.
  • If you cannot place a queen in the current column, you need to backtrack.
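
A Python sketch (added for illustration) of this column-by-column recursive
backtracking; solve_n_queens and its helpers are illustrative names, not from the
slides.

def solve_n_queens(n):
    solution = []                       # solution[c] = row of the queen in column c

    def safe(row, col):
        for c, r in enumerate(solution):
            if r == row or abs(r - row) == abs(c - col):
                return False            # same row or same diagonal as an earlier queen
        return True

    def place(col):
        if col == n:                    # base case: no more columns -> finished
            return True
        for row in range(n):
            if safe(row, col):
                solution.append(row)
                if place(col + 1):      # recursive step: consider the next column
                    return True
                solution.pop()          # impasse: backtrack to the previous column
        return False

    return solution if place(0) else None

print(solve_n_queens(8))   # one solution of the eight-queens problem (rows by column)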
Example
• One solution of the eight-queens problem [board figure omitted].
Graph colouring
• Graph Coloring Problem: Given a graph, color all the vertices so that any two
adjacent vertices get different colors.
• Objective: use the minimum number of colors.
• The chromatic number χ(G) is the smallest k such that G has a proper
k-coloring; G is then called k-chromatic.
Vertex colouring
• A k-coloring of a graph G is a labeling f : V(G) → {1, …, k}.
• A coloring is proper if no two vertices u and v connected by an edge have the
same color, i.e. f(u) ≠ f(v).
• G is k-colorable if it has a proper k-coloring.


Optimal Coloring
Simple cycles:
χ(Ceven) = 2
χ(Codd) = 3
Complete graphs:
χ(Kn) = n
CONT…
Wheels:
χ(Wodd) = 4
χ(Weven) = 3
(e.g. the wheel W5; figure omitted)
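
A backtracking sketch in Python (added for illustration; color_graph and its
parameters are illustrative names) that searches for a proper k-coloring by trying
colors vertex by vertex and backing up on conflicts.

def color_graph(adj, k):
    # adj is an adjacency-set dict {vertex: neighbours}; returns a proper
    # k-colouring as a dict, or None if no proper k-colouring exists.
    vertices = list(adj)
    colour = {}

    def ok(v, c):
        return all(colour.get(u) != c for u in adj[v])   # no neighbour already has c

    def assign(i):
        if i == len(vertices):
            return True
        v = vertices[i]
        for c in range(k):
            if ok(v, c):
                colour[v] = c
                if assign(i + 1):
                    return True
                del colour[v]            # backtrack
        return False

    return colour if assign(0) else None

# Example: an odd cycle C5 needs 3 colours (χ(Codd) = 3, as noted above).
C5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
print(color_graph(C5, 2))   # None: 2 colours are not enough
print(color_graph(C5, 3))   # a proper 3-colouring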
Edge colouring
• A k-edge-coloring of a graph G is a labeling f : E(G) → {1, …, k}.
• An edge coloring partitions E(G) into k sets E1, …, Ek (some possibly empty).
• An edge coloring is proper if adjacent edges have different colors. All
colorings henceforth are assumed proper.
• G is k-edge-colorable if it has a proper k-edge-coloring.
• The edge chromatic number χ′(G) is the smallest k such that G has a proper
k-edge-coloring; G is then called k-edge-chromatic.
Reading assignment
• Optimal binary search tree
• Disconnected graph
THE END!