
Design & Analysis of Algorithms

(KCS-503)
Unit-5
Selected topics
Course Outline:
⮚ Approximation Algorithm
⮚ Randomized Algorithm
Approximation Algorithm

An approximation algorithm is a way of dealing with NP-completeness for optimization problems. This technique does not guarantee the best solution. The goal of an approximation algorithm is to come as close as possible to the optimal value in a reasonable amount of time, which is at most polynomial time. Such algorithms are called approximation algorithms or heuristic algorithms.
• For the travelling salesperson problem, the optimization problem is to find the shortest cycle, and the approximation problem is to find a short cycle.
• For the vertex-cover problem, the optimization problem is to find a vertex cover with the fewest vertices, and the approximation problem is to find a vertex cover with few vertices.
Performance Ratios

• Suppose we work on an optimization problem where every solution carries a cost. An approximation algorithm returns a legal solution, but the cost of that legal solution may not be optimal.
• For example, suppose we are looking for a minimum-size vertex cover (VC). An approximation algorithm returns a VC, but its size (cost) may not be minimum.
• Another example: suppose we are looking for a maximum-size independent set (IS). An approximation algorithm returns an IS, but its size (cost) may not be maximum. Let C be the cost of the solution returned by the approximation algorithm, and let C* be the cost of an optimal solution.
• We say the approximation algorithm has an approximation ratio P(n) for inputs of size n if
  max(C/C*, C*/C) ≤ P(n)
Performance Ratios

• This definition applies for both minimization and maximization problems. For a
maximization problem, 0 < C ≤ C*, and the ratio C*/C gives the factor by which the
cost of an optimal solution is larger than the cost of the approximate solution.
Similarly, for a minimization problem, 0 < C* ≤ C, and the ratio C/C* gives the
factor by which the cost of the approximate solution is larger than the cost of an
optimal solution. Since all solutions are assumed to have positive cost, these ratios
are always well defined. The ratio bound of an approximation algorithm is never less
than 1, since C/C* < 1 implies C*/C > 1. An optimal algorithm has ratio bound 1,
and an approximation algorithm with a large ratio bound may return a solution that is
very much worse than optimal.
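As a minimal illustration of this definition (not part of the original slides; the function name is an assumption for illustration), the ratio bound can be computed directly from the two costs, assuming both are positive:

def approximation_ratio(c: float, c_star: float) -> float:
    """Return max(C/C*, C*/C) for positive costs C and C*.

    Works for both minimization (C >= C*) and maximization (C <= C*);
    the result is always >= 1, and equals 1 only for an optimal solution.
    """
    if c <= 0 or c_star <= 0:
        raise ValueError("costs must be positive")
    return max(c / c_star, c_star / c)

# Example: an approximate vertex cover of size 6 when the optimum has size 4
# gives a ratio bound of 1.5.
print(approximation_ratio(6, 4))  # 1.5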
Approximation scheme

• An approximation scheme for an optimization problem is an approximation algorithm that takes as input not only an instance of the problem, but also a value ϵ > 0 such that for any fixed ϵ, the scheme is an approximation algorithm with relative error bound ϵ. We say that an approximation scheme is a polynomial-time approximation scheme if for any fixed ϵ > 0, the scheme runs in time polynomial in the size n of its input instance.
• We say that an approximation scheme is a fully polynomial-time approximation scheme if its running time is polynomial both in 1/ϵ and in the size n of the input instance, where ϵ is the relative error bound for the scheme. For example, the scheme might have a running time of (1/ϵ)²·n³. With such a scheme, any constant-factor decrease in ϵ can be achieved with a corresponding constant-factor increase in the running time.
The vertex-cover problem

A vertex cover of an undirected graph G = (V, E) is a subset V' ⊆ V such that if (u, v) is an edge of G, then either u ∈ V' or v ∈ V' (or both). The size of a vertex cover is the number of vertices in it.
The vertex-cover problem is to find a vertex cover of minimum size in a given undirected graph. We call such a vertex cover an optimal vertex cover.
This problem is NP-hard, since the related decision problem is NP-complete.
Even though it may be difficult to find an optimal vertex cover in a graph G, it is not too hard to find a vertex cover that is near-optimal.
Algorithm

The following approximation algorithm takes as input an undirected graph G and returns a vertex cover whose size is guaranteed to be no more than twice the size of an optimal vertex cover.
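The lecture's own pseudocode for this algorithm is not reproduced here; the following is a minimal runnable Python sketch of the standard greedy 2-approximation it describes (the function name and edge-list representation are illustrative assumptions):

def approx_vertex_cover(edges):
    """Greedy 2-approximation for vertex cover.

    Repeatedly pick an arbitrary remaining edge (u, v), add both endpoints
    to the cover, and discard every edge incident on u or v. The resulting
    cover is at most twice the size of an optimal vertex cover.
    """
    cover = set()
    remaining = list(edges)
    while remaining:
        u, v = remaining[0]              # an arbitrary remaining edge
        cover.update((u, v))             # add both endpoints to the cover
        # remove every edge incident on either u or v
        remaining = [(a, b) for (a, b) in remaining
                     if a not in (u, v) and b not in (u, v)]
    return cover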
Example
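The worked example (a figure) from the original slide is not reproduced here. As an illustrative stand-in, running the sketch above on a small graph (chosen here for illustration, not taken from the slide):

edges = [("a", "b"), ("b", "c"), ("c", "d"), ("c", "e"),
         ("d", "e"), ("d", "f"), ("d", "g"), ("e", "f")]
print(approx_vertex_cover(edges))
# Prints a cover of size 6: {'a', 'b', 'c', 'd', 'e', 'f'} (in some order).
# An optimal cover for this graph, {'b', 'd', 'e'}, has size 3,
# so the approximation ratio here is exactly 2.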
Randomized algorithms

• A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic.
• The algorithm typically uses uniformly random bits as an
auxiliary input to guide its behavior, in the hope of achieving
good performance in the "average case" over all possible choices
of random bits.
• Formally, the algorithm's performance will be a random variable
determined by the random bits; thus either the running time, or
the output (or both) are random variables.
Randomized algorithms

• One has to distinguish between algorithms that use the random input to reduce the expected running time or memory usage but always terminate with a correct result in a bounded amount of time (Las Vegas algorithms),
• and probabilistic algorithms which, depending on the random input, have a chance of producing an incorrect result (Monte Carlo algorithms) or of failing to produce a result at all, either by signaling a failure or by failing to terminate.
A randomized version of quicksort

• Instead of always using A[r] as the pivot, we will use a randomly chosen element from the subarray A[p..r]. We do so by exchanging element A[r] with an element chosen at random from A[p..r]. This modification, in which we randomly sample the range p, ..., r, ensures that the pivot element x = A[r] is equally likely to be any of the r − p + 1 elements in the subarray. Because the pivot element is randomly chosen, we expect the split of the input array to be reasonably well balanced on average.
• The changes to PARTITION and QUICKSORT are small. In the new partition procedure, we simply implement the swap before actually partitioning:
A randomized version of quicksort

RANDOMIZED-PARTITION(A, p, r)
1 i ← RANDOM(p, r)
2 exchange A[r] ↔ A[i]
3 return PARTITION(A, p, r)

RANDOMIZED-QUICKSORT(A, p, r)
1 if p < r
2 then q ← RANDOMIZED-PARTITION(A, p, r)
3 RANDOMIZED-QUICKSORT(A, p, q - 1)
4 RANDOMIZED-QUICKSORT(A, q + 1, r)
Partitioning the array

PARTITION(A, p, r)
1 x ← A[r]
2 i ← p - 1
3 for j ← p to r - 1
4 do if A[j] ≤ x
5 then i ← i + 1
6 exchange A[i] ↔ A[j]
7 exchange A[i + 1] ↔ A[r]
8 return i + 1
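For reference, here is a minimal runnable Python rendering of the same three procedures (an illustrative translation, not the lecture's own code):

import random

def partition(A, p, r):
    """Partition A[p..r] around the pivot A[r]; return the pivot's final index."""
    x = A[r]
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def randomized_partition(A, p, r):
    """Swap a uniformly random element of A[p..r] into position r, then partition."""
    i = random.randint(p, r)
    A[r], A[i] = A[i], A[r]
    return partition(A, p, r)

def randomized_quicksort(A, p, r):
    """Sort A[p..r] in place using randomly chosen pivots."""
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)

# Usage: sort the whole array.
data = [13, 19, 9, 5, 12, 8, 7, 4, 21, 2, 6, 11]
randomized_quicksort(data, 0, len(data) - 1)
print(data)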
Formal Worst-Case Analysis of Quicksort

• T(n) = worst-case running time
  T(n) = max_{1 ≤ q ≤ n-1} (T(q) + T(n − q)) + Θ(n)
• Use substitution method to show that the running time of Quicksort is O(n²)
• Guess: T(n) = O(n²)
• Induction goal: T(n) ≤ cn²
• Induction hypothesis: T(k) ≤ ck² for any k < n
Worst-Case Analysis of Quicksort

• Proof of induction goal:
  T(n) ≤ max_{1 ≤ q ≤ n-1} (cq² + c(n − q)²) + Θ(n)
       = c · max_{1 ≤ q ≤ n-1} (q² + (n − q)²) + Θ(n)
• The expression q² + (n − q)² is convex in q, so over the range 1 ≤ q ≤ n − 1 it achieves its maximum at one of the endpoints:
  max_{1 ≤ q ≤ n-1} (q² + (n − q)²) = 1² + (n − 1)² = n² − 2(n − 1)
• Therefore
  T(n) ≤ cn² − 2c(n − 1) + Θ(n)
       ≤ cn²   (choosing c large enough that the 2c(n − 1) term dominates the Θ(n) term)
Average-Case Analysis of Quicksort

• Let X = total number of comparisons performed in all calls to PARTITION.
• The total work done over the entire execution of Quicksort is O(c·n + X) = O(n + X), since PARTITION is called at most n times and each call does only a constant amount of work apart from the comparisons.
• We therefore need to estimate E[X].
• Indicator Random Variables
• Given a sample space S and an event A, we define the indicator random variable I{A} associated with A:
  I{A} = 1 if A occurs, 0 if A does not occur
• The expected value of an indicator random variable X_A = I{A} is:
  E[X_A] = Pr{A}
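A quick illustrative check of E[X_A] = Pr{A} (not from the slides): estimating by simulation the expectation of the indicator of the event "a fair six-sided die shows an even number".

import random

def indicator_even_roll():
    # Indicator of the event A = "a fair six-sided die shows an even number".
    return 1 if random.randint(1, 6) % 2 == 0 else 0

trials = 100_000
empirical_mean = sum(indicator_even_roll() for _ in range(trials)) / trials
print(empirical_mean)  # close to Pr{A} = 0.5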
Notation

• Rename the elements of A as z_1, z_2, . . . , z_n, with z_i being the i-th smallest element.
• Define Z_ij = {z_i, z_{i+1}, . . . , z_j}, the set of elements between z_i and z_j, inclusive.
Total Number of Comparisons in PARTITION
• Define X_ij = I{z_i is compared to z_j}
• Since each pair of elements is compared at most once, the total number of comparisons X performed by the algorithm is a sum of indicators:
  X = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} X_ij
Expected Number of Total Comparisons in PARTITION

• Compute the expected value of X:
  E[X] = E[ Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} X_ij ]
       = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} E[X_ij]                          (by linearity of expectation)
       = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} Pr{z_i is compared to z_j}       (the expectation of an indicator random variable equals the probability of its event)
Probability of comparing zi with zj

• z_i and z_j are compared exactly when one of them is the first element of Z_ij to be chosen as a pivot; if any other element of Z_ij is chosen first, z_i and z_j are separated into different subarrays and never compared. Hence
  Pr{z_i is compared to z_j}
    = Pr{z_i is the first pivot chosen from Z_ij} + Pr{z_j is the first pivot chosen from Z_ij}   (mutually exclusive events)
    = 1/(j − i + 1) + 1/(j − i + 1) = 2/(j − i + 1)
• There are j − i + 1 elements in Z_ij:
  – The pivot is chosen randomly and independently.
  – The probability that any particular element of Z_ij is the first one chosen is 1/(j − i + 1).
Number of Comparisons in PARTITION

Expected number of comparisons in PARTITION:
  E[X] = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} 2/(j − i + 1)
       = Σ_{i=1}^{n-1} Σ_{k=1}^{n-i} 2/(k + 1)        (set k = j − i)
       < Σ_{i=1}^{n-1} Σ_{k=1}^{n} 2/k
       = Σ_{i=1}^{n-1} O(lg n)                        (harmonic series: Σ_{k=1}^{n} 1/k = ln n + O(1))
       = O(n lg n)
⇒ The expected running time of Quicksort using RANDOMIZED-PARTITION is O(n lg n).
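As an illustrative sanity check (not part of the slides; names and parameters are assumptions), the following simulation counts the comparisons made by randomized quicksort and compares the empirical average against the roughly 2·n·ln n bound derived above.

import math
import random

def count_comparisons(A):
    """Run randomized quicksort on a copy of A and count element comparisons in PARTITION."""
    A = list(A)
    count = 0

    def partition(p, r):
        nonlocal count
        pivot_index = random.randint(p, r)
        A[r], A[pivot_index] = A[pivot_index], A[r]
        x = A[r]
        i = p - 1
        for j in range(p, r):
            count += 1                       # one comparison A[j] <= x
            if A[j] <= x:
                i += 1
                A[i], A[j] = A[j], A[i]
        A[i + 1], A[r] = A[r], A[i + 1]
        return i + 1

    def quicksort(p, r):
        if p < r:
            q = partition(p, r)
            quicksort(p, q - 1)
            quicksort(q + 1, r)

    quicksort(0, len(A) - 1)
    return count

n = 2000
trials = 20
avg = sum(count_comparisons(random.sample(range(10 * n), n)) for _ in range(trials)) / trials
print(avg, 2 * n * math.log(n))   # the empirical average stays below the ~2 n ln n bound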
The End

B N Pandey 7/5/2020
