Chapter03 - Greedy Method

The document discusses greedy algorithms and their applications. It describes the greedy method as constructing solutions to optimization problems in a step-by-step manner by making locally optimal choices at each step. Some applications where greedy algorithms yield optimal solutions include minimum spanning trees, single-source shortest paths, and change making. The knapsack problem is discussed as an example where greedy algorithms provide approximations. Prim's and Kruskal's algorithms are presented as approaches for finding minimum spanning trees.

Uploaded by Yenatu Lij Baye
© All Rights Reserved

Analysis of Algorithms

Chapter 03
Greedy Methods

Outline
 Greedy method

 Applications
 Knapsack problem
 Minimum cost spanning trees
 Single source shortest path problem
 Job sequencing with deadlines

Greedy Method
 The greedy design technique is primarily used in optimization problems.
 Optimization problems are problems in which we would like to find the best of all possible solutions.
 In other words, we need to find the solution that has the optimal (maximum or minimum) value while satisfying the given constraints.

 The greedy approach constructs a solution to a problem through a sequence of steps:
 each step produces a partial solution,
 and the partial solution is extended progressively until the complete solution is obtained.
Greedy Method
 Constructs a solution to an optimization problem (defined by an objective function and a set of constraints) step by step through a sequence of choices that are:
 feasible, i.e., satisfying the constraints
 locally optimal (with respect to some neighborhood definition)
 irrevocable, i.e., never altered once made

 For some problems, the greedy method yields a globally optimal solution for every instance. For most problems it does not, but it can still be useful for fast approximations. We are mostly interested in the former case.
Applications of the Greedy Strategy
 Optimal solutions:
 change making for “normal” coin denominations

 minimum spanning tree (MST)

 single-source shortest paths

 simple scheduling problems

 Huffman codes

 Approximations/heuristics:
 traveling salesman problem (TSP)

 knapsack problem

 other combinatorial optimization problems

Change-Making Problem
Given unlimited amounts of coins of denominations d1 > d2 > … > dm,
give change for amount n with the least number of coins.
Q: What are the objective function and constraints?

Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c

Greedy solution: <1, 2, 0, 3>, i.e., one quarter, two dimes, and three pennies

The greedy solution is
 optimal for any amount with a “normal” set of denominations

Ex: Prove the greedy algorithm is optimal for the above denominations.

 possibly non-optimal for arbitrary coin denominations

For example, with d1 = 25c, d2 = 10c, d3 = 1c and n = 30c, the greedy solution uses six coins (25 + 1 + 1 + 1 + 1 + 1), while three 10c coins suffice.

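The rule used above (always take as many of the largest remaining denomination as will fit) can be sketched in a few lines; the function name is illustrative, not from the slides.

```python
def greedy_change(denominations, amount):
    """Greedy change-making: repeatedly take as many of the largest
    remaining denomination as still fit into the amount."""
    counts = []
    for d in sorted(denominations, reverse=True):
        counts.append(amount // d)
        amount %= d
    return counts

# "Normal" denominations: greedy is optimal.
print(greedy_change([25, 10, 5, 1], 48))   # -> [1, 2, 0, 3]

# Arbitrary denominations: greedy can be suboptimal.
# For n = 30 it uses 6 coins (one 25c and five 1c), while 10c+10c+10c uses 3.
print(greedy_change([25, 10, 1], 30))      # -> [1, 0, 5]
```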
Knapsack Problem
 There are n different items in a store
 Item xi:
 weighs wi pounds
 is worth $pi
 A thief breaks in
 He can carry up to W pounds in his knapsack
 What should he take to maximize the value of his haul?
Knapsack Problem
 Each object xi is placed into the knapsack (capacity m) subject to the following constraints to obtain a feasible solution:

 maximize   Σ(i=1..n) pi xi      (maximize the profit)

 subject to Σ(i=1..n) wi xi ≤ m  (the total weight cannot exceed the capacity m)

 The objective of the knapsack problem is to maximize the profit of the objects placed into the knapsack.
Knapsack Problem Types
 0-1 Knapsack Problem:
 the items cannot be divided
 the thief must take an entire item or leave it behind
 (xi = 0: item not added; xi = 1: item added to the knapsack)

 Fractional Knapsack Problem:
 the thief can take partial items
 for instance, items are liquids or powders
 solvable with a greedy algorithm
Fractional Knapsack Problem - Example
Consider the following instance of the knapsack problem: n = 3, m = 20;
(p1, p2, p3) = (25, 24, 15) and (w1, w2, w3) = (18, 15, 10)

Method 1: Arrange the items in increasing order of their weights, i.e., fill the knapsack with the minimum-weight object first.

Item  Profit  Weight  Remaining capacity  Fraction selected
x3    15      10      20 - 10 = 10        x3 = 1
x2    24      15      10 - 10 = 0         x2 = remaining capacity / w2 = 10/15 = 2/3
x1    25      18      0                   x1 = 0

Feasible solution: (x1, x2, x3) = (0, 2/3, 1)

Total profit = Σ(i=1..n) pi xi = p1x1 + p2x2 + p3x3
             = 25(0) + 24(2/3) + 15(1)
             = 16 + 15
             = 31
Example – Contd.
Method 2: Arrange the items in decreasing order of their profits.

Item  Profit  Weight  Remaining capacity  Fraction selected
x1    25      18      20 - 18 = 2         x1 = 1
x2    24      15      2 - 2 = 0           x2 = 2/15
x3    15      10      0                   x3 = 0

Feasible solution: (x1, x2, x3) = (1, 2/15, 0)

Total profit = Σ(i=1..n) pi xi = p1x1 + p2x2 + p3x3
             = 25(1) + 24(2/15) + 15(0)
             = 25 + 48/15
             = 28.2
Example – Contd.
Method 3: Arrange the items in decreasing order of their profit-to-weight ratios:
p1/w1 = 25/18 ≈ 1.39,  p2/w2 = 24/15 = 1.6,  p3/w3 = 15/10 = 1.5

Item  pi/wi  Profit  Weight  Remaining capacity  Fraction selected
x2    1.6    24      15      20 - 15 = 5         x2 = 1
x3    1.5    15      10      5 - 5 = 0           x3 = 1/2
x1    1.39   25      18      0                   x1 = 0

Feasible solution: (x1, x2, x3) = (0, 1, 1/2)

Total profit = Σ(i=1..n) pi xi = p1x1 + p2x2 + p3x3
             = 25(0) + 24(1) + 15(1/2)
             = 24 + 7.5
             = 31.5

This profit-to-weight greedy always produces the optimal solution for the fractional knapsack problem.
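Method 3 is the general greedy strategy for the fractional knapsack. A minimal Python sketch (function name is illustrative), run on this slide's instance:

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: consider items in decreasing order of
    profit/weight ratio and take as much of each as still fits."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * len(profits)            # fraction of each item taken
    total = 0.0
    for i in order:
        if capacity == 0:
            break
        take = min(weights[i], capacity)  # whole item, or whatever fits
        x[i] = take / weights[i]
        total += profits[i] * x[i]
        capacity -= take
    return x, total

# Slide instance: n = 3, m = 20, p = (25, 24, 15), w = (18, 15, 10)
x, profit = fractional_knapsack([25, 24, 15], [18, 15, 10], 20)
print(x, profit)  # -> [0.0, 1.0, 0.5] 31.5
```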
Minimum Spanning Tree (MST)
 Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G’s vertices

 Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight

 Example:
[Figure: a graph G on vertices a, b, c, d with edge weights 1, 2, 3, 4, 6; a possible spanning tree; and the minimum spanning tree]
Contd.
Applications
 Useful in constructing networks
 Used to find airline routes
 Used to approximately solve the travelling salesman problem

Approaches:
 Prim’s Algorithm
 Kruskal’s Algorithm
Prim's algorithm
 In computer science, Prim's algorithm (also known as Jarník's
algorithm) is a greedy algorithm that finds a minimum spanning
tree for a weighted undirected graph.

 This means it finds a subset of the edges that forms a tree that
includes every vertex, where the total weight of all the edges in
the tree is minimized.

 The algorithm operates by building this tree one vertex at a


time, from an arbitrary starting vertex, at each step adding the
cheapest possible connection from the tree to another vertex.

Prim’s Algorithm for Finding an MST
Step 1: Pick any vertex x ∈ V. Let A = {x}, B = V - {x}.

Step 2: Select (u, v) ∈ E with u ∈ A, v ∈ B such that (u, v) has the smallest weight among all edges between A and B.

Step 3: Put (u, v) into the tree. A = A ∪ {v}, B = B - {v}.

Step 4: If B = ∅, stop; otherwise, go to Step 2.
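The steps above can be sketched with a min-heap holding the edges that cross from the tree to the rest of the graph. The adjacency list below is a hypothetical 4-vertex graph in the spirit of the slide examples (the weights are an assumption, since the original figures did not survive extraction):

```python
import heapq

def prim_mst(adj, start):
    """Prim's algorithm: grow the tree from `start`, always adding the
    cheapest edge that crosses from the tree to a new vertex.
    adj: dict vertex -> list of (weight, neighbor)."""
    in_tree = {start}
    heap = [(w, start, v) for w, v in adj[start]]
    heapq.heapify(heap)
    total, tree_edges = 0, []
    while heap and len(in_tree) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:
            continue                      # edge no longer crosses the cut
        in_tree.add(v)
        total += w
        tree_edges.append((u, v, w))
        for w2, x in adj[v]:
            if x not in in_tree:
                heapq.heappush(heap, (w2, v, x))
    return total, tree_edges

# Hypothetical 4-vertex graph (weights assumed, not the slides' figure)
adj = {
    'a': [(1, 'b'), (4, 'c'), (6, 'd')],
    'b': [(1, 'a'), (2, 'c'), (3, 'd')],
    'c': [(4, 'a'), (2, 'b'), (6, 'd')],
    'd': [(6, 'a'), (3, 'b'), (6, 'c')],
}
total, tree_edges = prim_mst(adj, 'a')
print(total, tree_edges)  # -> 6 [('a', 'b', 1), ('b', 'c', 2), ('b', 'd', 3)]
```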
An Example for Prim’s Algorithm
[Figure: step-by-step trace of Prim’s algorithm on an example graph]
Example 2
[Figure: Prim’s algorithm trace on a 4-vertex graph (a, b, c, d) with edge weights 1, 2, 3, 4, 6, adding the cheapest connection to the tree at each step]
Prim’s algorithm - Performance
 Needs a priority queue for locating the closest fringe vertex.

 Efficiency
 O(n²) for the weight-matrix representation of the graph and an array implementation of the priority queue
 O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue
Kruskal’s Algorithm
 Kruskal’s algorithm finds a minimum spanning forest of a possibly disconnected graph; in contrast, the most basic form of Prim’s algorithm only finds minimum spanning trees in connected graphs.
 However, by running Prim’s algorithm separately for each connected component of the graph, it can also be used to find the minimum spanning forest.
 In terms of asymptotic time complexity, the two algorithms are equally fast for sparse graphs, but slower than other, more sophisticated algorithms.
 However, for graphs that are sufficiently dense, Prim’s algorithm can be made to run in linear time, meeting or improving the time bounds of other algorithms.
Kruskal’s MST Algorithm
 Sort the edges in nondecreasing order of length

 “Grow” the tree one edge at a time to produce the MST through a series of expanding forests F1, F2, …, Fn-1

 On each iteration, add the next edge from the sorted list unless this would create a cycle. (If it would, skip the edge.)
Kruskal’s Algorithm for Finding MST
Input: A weighted, connected and undirected graph G = (V, E).
Output: A minimal spanning tree for G.

T := ∅
While T contains fewer than n - 1 edges do
Begin
  Choose an edge (v, w) from E of the smallest weight
  Delete (v, w) from E
  If (adding (v, w) to T does not create a cycle in T) then
    Add (v, w) to T
  Else
    Discard (v, w)
End
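A minimal Python sketch of the pseudocode above, using a union-find structure for the cycle check; the vertex numbering and edge weights in the example are assumptions, not the slides' figure:

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm: scan edges in nondecreasing weight order and
    keep an edge iff it joins two different components (union-find).
    n: number of vertices 0..n-1; edges: list of (weight, u, v)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:              # walk to the component root,
            parent[x] = parent[parent[x]]  # compressing the path as we go
            x = parent[x]
        return x

    total, mst = 0, []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                       # no cycle: accept the edge
            parent[ru] = rv
            total += w
            mst.append((u, v, w))
            if len(mst) == n - 1:
                break
    return total, mst

# Hypothetical 4-vertex instance (weights assumed, not the slides' figure)
edges = [(1, 0, 1), (2, 1, 2), (3, 1, 3), (4, 0, 2), (6, 0, 3)]
total, mst = kruskal_mst(4, edges)
print(total, mst)  # -> 6 [(0, 1, 1), (1, 2, 2), (1, 3, 3)]
```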
Example 1
[Figure: step-by-step trace of Kruskal’s algorithm on an example graph]
Example 2
[Figure: Kruskal’s algorithm trace on a 4-vertex graph (a, b, c, d) with edge weights 1, 2, 3, 4, 6, adding edges in nondecreasing weight order and skipping any edge that would create a cycle]
Kruskal’s algorithm - Performance
 The algorithm looks easier than Prim’s but is harder to implement (cycle checking!)

 Cycle checking: a cycle is created iff the added edge connects vertices in the same connected component

 Runs in O(m log m) time, with m = |E|. The time is mostly spent on sorting.
Single-Source Shortest Paths (SSSP)
Single-Source Shortest Paths Problem: Given a weighted connected (directed) graph G, find the shortest paths from a source vertex s to each of the other vertices.

Dijkstra’s algorithm: Similar to Prim’s MST algorithm, but with a different way of computing the numerical labels: among the vertices not already in the tree, it finds the vertex u with the smallest sum
    dv + w(v, u)
where
 v is a vertex for which a shortest path has already been found on preceding iterations (such vertices form a tree rooted at s)
 dv is the length of the shortest path from the source s to v
 w(v, u) is the length (weight) of the edge from v to u
Example

[Figure: graph with undirected edges a–b = 3, b–c = 4, a–d = 7, b–d = 2, d–e = 4, c–e = 6]

Tree vertices    Remaining vertices
a(-, 0)          b(a, 3)      c(-, ∞)      d(a, 7)    e(-, ∞)
b(a, 3)          c(b, 3+4)    d(b, 3+2)    e(-, ∞)
d(b, 5)          c(b, 7)      e(d, 5+4)
c(b, 7)          e(d, 9)
e(d, 9)

Shortest paths from a: b = 3, d = 5, c = 7, e = 9.
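The trace above can be reproduced with a heap-based sketch of Dijkstra’s algorithm (function name and data layout are mine); the edge list is read off the example graph:

```python
import heapq

def dijkstra(adj, source):
    """Dijkstra's algorithm with a min-heap and lazy deletion of
    outdated entries. adj: dict vertex -> list of (weight, neighbor)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale heap entry, skip
        for w, v in adj[u]:
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd              # found a shorter path to v
                heapq.heappush(heap, (nd, v))
    return dist

# Undirected graph from the example above
edge_list = [('a', 'b', 3), ('b', 'c', 4), ('a', 'd', 7),
             ('b', 'd', 2), ('d', 'e', 4), ('c', 'e', 6)]
adj = {v: [] for v in 'abcde'}
for u, v, w in edge_list:
    adj[u].append((w, v))
    adj[v].append((w, u))
print(dijkstra(adj, 'a'))  # b=3, d=5, c=7, e=9
```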
Dijkstra’s algorithm - Performance
 Applicable to both undirected and directed graphs

 Efficiency
 O(|V|²) for graphs represented by a weight matrix and an array implementation of the priority queue
 O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue
Job Sequencing
 A set of n jobs
 Each job i has a deadline di and a profit pi
 Each job takes one unit of time, and job i earns pi only if it completes by its deadline
 Objective: select a feasible subset J of jobs that maximizes the total profit Σ(i∈J) pi
 When more jobs compete for the same deadline slots than can be scheduled, the lower-profit jobs are rejected
 Example: n = 4, (p1, p2, p3, p4) = (100, 10, 15, 27), (d1, d2, d3, d4) = (2, 1, 2, 1). Answer: maximum profit = 127.

s/no  Feasible solution  Processing sequence  Σ pi
1     J = {ø}            -                    0
2     J = {1}            1                    100
3     J = {1,4}          4,1                  127
4     J = {1,2}          2,1                  110
5     J = {1,3}          3,1                  115
6     J = {2,3}          2,3                  25
7     J = {3,4}          4,3                  42
8     J = {3}            3                    15
9     J = {4}            4                    27

The optimal solution is J = {1,4} with profit 127.
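The usual greedy strategy for this problem (consider jobs in decreasing order of profit and place each in the latest free time slot at or before its deadline) reproduces the optimal answer; a minimal sketch with illustrative names:

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing: take jobs in decreasing order of profit and
    schedule each in the latest free unit-time slot at or before its
    deadline; jobs that find no free slot are rejected."""
    n = len(profits)
    by_profit = sorted(range(n), key=lambda i: profits[i], reverse=True)
    slots = [None] * (max(deadlines) + 1)     # slots[t] = job run in slot t
    total = 0
    for i in by_profit:
        for t in range(deadlines[i], 0, -1):  # try the latest slot first
            if slots[t] is None:
                slots[t] = i
                total += profits[i]
                break
    order = [j + 1 for j in slots[1:] if j is not None]  # 1-based job ids
    return total, order

# Slide example: p = (100, 10, 15, 27), d = (2, 1, 2, 1)
total, order = job_sequencing([100, 10, 15, 27], [2, 1, 2, 1])
print(total, order)  # -> 127 [4, 1]
```

Applied to self-review question 3 below, the same function returns profit 74 with processing order [6, 7, 4, 3], matching the solution table.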
Self-Review Questions
1. Explain the greedy method with one example.

2. Consider the following knapsack problem with n = 3 and m = 105. The profits are (p1, p2, p3) = (20, 15, 15) and the corresponding weights are (100, 10, 10). Obtain the feasible solutions and find the optimal solution.

3. Given n = 7, profits (p1, p2, p3, p4, p5, p6, p7) = (3, 5, 20, 18, 1, 6, 30) and deadlines (d1, d2, d3, d4, d5, d6, d7) = (1, 3, 4, 3, 2, 1, 2), solve the problem using the job sequencing method.
Solution for Question 3: maximum profit = 74.

S/no  Feasible solution  Processing sequence  Σ pi
1     J = {ø}            -                    0
2     J = {7}            7                    30
3     J = {7,3}          7,3                  50
4     J = {7,3,4}        7,4,3                68
5     J = {7,3,4,6}      6,7,4,3              74
6     J = {3}            3                    20
7     J = {4}            4                    18
8     J = {6}            6                    6
9     J = {2}            2                    5
10    J = {1}            1                    3
