Chapter 4
The Greedy Approach
Greedy Approach
Initially the solution set = ∅
Repeat
Add an item to the solution set
Until the set represents a solution to that instance
Greedy Algorithm
Selection procedure: choose the next item to add to the solution set according to the greedy criterion, satisfying the locally optimal consideration.
Feasibility check: determine whether the new set is feasible, that is, whether it is possible to complete this set to provide a solution to the problem instance.
Solution check: determine whether the new set produced is a solution to the problem instance.
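As a toy illustration of these three steps (an assumed example, not from the text), consider greedy coin changing, where the greedy criterion is "take the largest coin that fits":

```python
def make_change(amount, coins):
    """Greedy coin changing: repeatedly add the largest coin that fits."""
    solution = []
    remaining = amount
    while remaining > 0:                      # solution check: done when remaining == 0
        # Selection procedure: the locally optimal choice is the
        # largest coin not exceeding the remaining amount.
        candidates = [c for c in coins if c <= remaining]
        if not candidates:                    # feasibility check fails: no coin fits
            return None
        coin = max(candidates)
        solution.append(coin)
        remaining -= coin
    return solution

print(make_change(8, [1, 4, 5]))  # [5, 1, 1, 1]
```

Note that the greedy answer [5, 1, 1, 1] uses four coins while 4 + 4 uses two; a greedy algorithm is only guaranteed optimal for problems where the locally optimal property can be proved correct, as done for the spanning-tree algorithms below.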
Greedy algorithm

[Figure: a weighted graph on vertices v1–v5 (edge weights v1v2 = 1, v1v3 = 3, v2v3 = 3, v2v4 = 6, v3v4 = 4, v3v5 = 2, v4v5 = 5) together with two of its spanning trees]
Let G = (V, E) be a connected, weighted, undirected graph.
Let T = (V, F) be a spanning tree for G, where F ⊆ E.
Find T such that the sum of the weights of the edges in F is minimal.
Prim’s Algorithm
Kruskal’s Algorithm
Each uses a different locally optimal property.
Must prove that each algorithm actually produces a minimum spanning tree.
[Figure: successive steps of growing a spanning tree of the example graph on v1–v5]
The example graph represented by the array W (∞ marks a missing edge):

        1    2    3    4    5
   1    0    1    3    ∞    ∞
   2    1    0    3    6    ∞
   3    3    3    0    4    2
   4    ∞    6    4    0    5
   5    ∞    ∞    2    5    0
For each vertex vi not in Y, let vm be the vertex in Y nearest to vi. Then
nearest[i] = m
distance[i] = weight on edge between vi and vm
Algorithm 4.1
Prim’s Algorithm
Problem: Determine a minimum spanning tree.
Inputs: integer n ≥ 2, and a connected, weighted, undirected graph containing n vertices. The graph is represented by a two-dimensional array W, indexed (1..n, 1..n), where W[i][j] is the weight on the edge between the ith vertex and the jth vertex.
Outputs: set of edges F in a minimum spanning tree for the
graph.
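A sketch of Prim's algorithm following this specification (Python, with 0-based indices rather than the 1..n indexing above; the nearest/distance arrays play the roles described in the text, with −1 marking vertices already in Y):

```python
INF = float('inf')

def prim(n, W):
    """Return the edge set F of a minimum spanning tree of the n-vertex
    graph given by the weight matrix W (INF = no edge)."""
    F = []                                   # edges added to the tree
    nearest = [0] * n                        # vertex in Y nearest to v_i
    distance = [W[0][i] for i in range(n)]   # weight of edge to that vertex
    distance[0] = -1                         # -1 marks vertices already in Y
    for _ in range(n - 1):
        # Selection: vertex outside Y nearest to Y (the greedy criterion).
        minimum = INF
        for i in range(n):
            if 0 <= distance[i] < minimum:
                minimum = distance[i]
                vnear = i
        F.append((nearest[vnear], vnear))
        distance[vnear] = -1                 # move vnear into Y
        # Update nearest/distance for vertices still outside Y.
        for i in range(n):
            if W[i][vnear] < distance[i]:
                distance[i] = W[i][vnear]
                nearest[i] = vnear
    return F

# The example graph from these slides (v1..v5 as indices 0..4):
W = [[0,   1,   3,   INF, INF],
     [1,   0,   3,   6,   INF],
     [3,   3,   0,   4,   2  ],
     [INF, 6,   4,   0,   5  ],
     [INF, INF, 2,   5,   0  ]]
print(prim(5, W))  # [(0, 1), (0, 2), (2, 4), (2, 3)]
```

The output edges correspond to v1v2, v1v3, v3v5, v3v4, with total weight 1 + 3 + 2 + 4 = 10, matching the trace that follows.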
Step 2: min = 3, vnear = 3; add edge v1v3.
dist[2] = -1; dist[3] = -1
Update: dist[4] = 4; dist[5] = 2
nrt[2] = 1; nrt[3] = 1; nrt[4] = 3; nrt[5] = 3

Step 3: min = 2, vnear = 5; add edge v3v5.
dist[2] = -1; dist[3] = -1; dist[4] = 4; dist[5] = -1

Step 4: min = 4, vnear = 4; add edge v3v4.
dist[4] = -1; the tree now spans all five vertices.
The subset {(v1, v2) , (v1, v3)} is promising, and the subset
{(v2, v4)} is not promising.
Lemma 4.1
Let G = (V, E) be a connected, weighted, undirected graph; let F be a promising subset of E; and let Y be the set of vertices connected by the edges in F. If e is an edge of minimum weight that connects a vertex in Y to a vertex in V − Y, then F ∪ {e} is promising.
Proof:
We use induction to show that the set F is promising after each iteration of the repeat loop.
Induction base: clearly the empty set ∅ is promising.
Induction step: assume F is promising before an iteration. The edge e chosen in that iteration is an edge of minimum weight connecting a vertex in Y to a vertex in V − Y, so by Lemma 4.1 the new set F ∪ {e} is promising.
By the induction proof, the final set of edges is promising. Because this set consists of the edges in a spanning tree, that tree must be a minimum spanning tree.
4.1.2 Kruskal’s Minimum Spanning Tree Algorithm
Edges sorted by weight: V1V2 (1), V3V5 (2), V1V3 (3), V2V3 (3), V3V4 (4), V2V4 (6) — wait, V4V5 (5) precedes V2V4 (6).

Edges sorted by weight: V1V2 (1), V3V5 (2), V1V3 (3), V2V3 (3), V3V4 (4), V4V5 (5), V2V4 (6)

Initially: disjoint sets {V1}, {V2}, {V3}, {V4}, {V5}; F = ∅

Add V1V2: disjoint sets {V1, V2}, {V3}, {V4}, {V5}; F = {V1V2}

Add V3V5: disjoint sets {V1, V2}, {V3, V5}, {V4}; F = {V1V2, V3V5}

Add V1V3: disjoint sets {V1, V2, V3, V5}, {V4}; F = {V1V2, V3V5, V1V3}

(V2V3 is rejected: V2 and V3 are already in the same set, so adding it would create a cycle.)

Add V3V4: disjoint sets {V1, V2, V3, V4, V5}; F = {V1V2, V3V5, V1V3, V3V4}
Algorithm 4.2 Kruskal
3. while loop:
In the worst case, every edge is considered before the while loop is exited, which means there are m passes through the loop.

p = find(i) sets p to point to the set containing index i; find ∈ Θ(lg m).
merge(p, q) merges two sets into one; merge ∈ Θ(c), where c is a constant.
equal(p, q), where p and q point to sets, returns true if p and q point to the same set; equal ∈ Θ(c), where c is a constant.
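A sketch of Kruskal's algorithm using the disjoint-set operations named above (find, merge, and the equal test expressed as a comparison of representatives; the path-compression line is an added optimization, not required by the text):

```python
def kruskal(n, edges):
    """edges: list of (weight, u, v) tuples, vertices 0..n-1.
    Returns the MST edge set F."""
    parent = list(range(n))          # each vertex starts in its own set

    def find(i):
        # Representative of the set containing index i.
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    F = []
    for w, u, v in sorted(edges):    # consider edges in nondecreasing weight
        p, q = find(u), find(v)
        if p != q:                   # not equal(p, q): edge creates no cycle
            parent[p] = q            # merge(p, q)
            F.append((u, v))
        if len(F) == n - 1:          # spanning tree complete
            break
    return F

# The example graph from these slides (V1..V5 as 0..4):
edges = [(1, 0, 1), (2, 2, 4), (3, 0, 2), (3, 1, 2),
         (4, 2, 3), (5, 3, 4), (6, 1, 3)]
print(kruskal(5, edges))  # [(0, 1), (2, 4), (0, 2), (2, 3)]
```

The edges are accepted in the same order as in the trace above: V1V2, V3V5, V1V3, V3V4, with V2V3 rejected because its endpoints are already in the same set.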
Lemma 4.2
Let G = (V, E) be a connected, weighted, undirected graph, let F be a promising subset of E, and let e be an edge of minimum weight in E − F such that F ∪ {e} has no cycles. Then F ∪ {e} is promising.

The proof of Lemma 4.2 is similar to the proof of Lemma 4.1.
Theorem 4.2
Kruskal’s Algorithm always produces a minimum spanning tree.
Prim vs. Kruskal
Prim’s Algorithm: T(n) ∈ Θ(n²)
Kruskal’s Algorithm: W(m, n) ∈ Θ(m lg m), where n − 1 ≤ m ≤ n(n − 1)/2

Sparse graph: m close to n − 1
Kruskal’s is Θ(n lg n), so Kruskal’s is faster than Prim’s.
Highly connected graph: m close to n(n − 1)/2
Kruskal’s is Θ(n² lg n), so Prim’s is faster than Kruskal’s.
Huffman Code
Prefix Codes
In a prefix code, no codeword for one character constitutes the beginning of the codeword for another character.
For example, if 01 is the codeword for ‘a’, then 011 could not be the codeword for ‘b’.
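The prefix property can be checked mechanically; a small sketch (the codewords below are illustrative, not from the text):

```python
def is_prefix_code(codewords):
    """True if no codeword is a prefix of another. After lexicographic
    sorting, any codeword that is a prefix of another sorts immediately
    before some extension of itself, so adjacent pairs suffice."""
    words = sorted(codewords)
    return all(not words[i + 1].startswith(words[i])
               for i in range(len(words) - 1))

print(is_prefix_code(["01", "000", "001", "10", "11"]))  # True
print(is_prefix_code(["01", "011"]))  # False: 01 begins 011
```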
Example
Let’s compute the number of bits for each encoding:
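The example data from the original slide is not preserved in this text. As a hypothetical stand-in (all frequencies and codeword lengths below are assumed, not from the source), the count for any encoding is the sum over characters of frequency times codeword length:

```python
# Hypothetical character frequencies (85 characters total).
freqs = {"a": 16, "b": 5, "c": 12, "d": 17, "e": 10, "f": 25}

# Two candidate encodings, given as codeword lengths per character:
fixed = {ch: 3 for ch in freqs}                            # fixed 3-bit code
prefix = {"a": 2, "b": 4, "c": 3, "d": 2, "e": 4, "f": 2}  # a prefix code

def total_bits(lengths, freqs):
    # bits(file) = sum over characters of frequency * codeword length
    return sum(freqs[ch] * lengths[ch] for ch in freqs)

print(total_bits(fixed, freqs))   # 255 bits
print(total_bits(prefix, freqs))  # 212 bits: the prefix code is shorter
```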
Huffman’s Algorithm
Huffman developed a greedy algorithm that produces an
optimal binary character code by constructing a binary tree
corresponding to an optimal code. A code produced by this
algorithm is called a Huffman code.
Huffman’s Algorithm
If the priority queue is implemented as a heap, it can be initialized in Θ(n) time. Furthermore, each heap operation requires Θ(lg n) time. Since there are n − 1 passes through the for-i loop, the algorithm runs in Θ(n lg n) time.
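A heap-based sketch of Huffman's algorithm matching this analysis: the priority queue is a binary heap, heapify costs Θ(n), and each of the n − 1 merge passes costs Θ(lg n). The frequencies in the usage line are illustrative, not from the text.

```python
import heapq

def huffman(freqs):
    """freqs: dict mapping character -> frequency.
    Returns a dict mapping character -> binary codeword."""
    # Heap entries carry an insertion counter so ties never compare trees.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freqs.items())]
    heapq.heapify(heap)                      # initialize queue in Theta(n)
    i = len(heap)
    while len(heap) > 1:                     # n - 1 merge passes
        # Greedy step: merge the two trees of smallest total frequency.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, i, (left, right)))
        i += 1
    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):          # internal node: recurse
            walk(node[0], code + "0")        # left branch appends 0
            walk(node[1], code + "1")        # right branch appends 1
        else:
            codes[node] = code or "0"        # leaf: record the codeword
    walk(heap[0][2], "")
    return codes

codes = huffman({"a": 16, "b": 5, "c": 12, "d": 17, "e": 10, "f": 25})
# More frequent characters receive shorter codewords.
print(sorted((ch, len(c)) for ch, c in codes.items()))
```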