
MIS 206: Business Data Structures and Algorithms

Sorting

Several pages are based on the notes by Dr. Bo Tang and Dr. Zongwei Luo
Sorting
Sorting Applications

Sorting
• Sorting problem
• Input: an array A[1…n] with n integers (or characters, strings, etc., as long as a proper ordering can be established between pairs)
• Output: the sorted array A (in ascending or descending order)

• Example: sort A[1…n] in ascending order
• Input: 8 6 1 3 7 2 5 4
• Output: 1 2 3 4 5 6 7 8

• All examples in this lecture use integers and ascending order. Sorting in descending order just requires reversing the comparison signs.
Types of Sorting Algorithms
• Comparison-based sorting
• Only uses comparisons (<, =, >, ≤, ≥) between pairs to determine the final sorted sequence
• Running time lower bounded by Ω(n log n)
• Holds for both worst and average cases
• Holds for both deterministic and randomized algorithms
• Insertion sort, bubble sort, selection sort, mergesort, heapsort, quicksort
• Other sorting algorithms
• Under certain conditions, may run faster than O(n log n)
• Counting sort, radix sort, bucket sort
Sorting
• For a deck of playing cards, how would you sort it?
Selection Sort
• Idea of selection sort
• Start with empty hand, all cards on table
• Select the smallest card from table
• Insert the card into hand
Selection Sort
Sort A[1…n] in ascending order:

1. for integer i ← 1 to n-1
2.   k ← i
3.   for integer j ← i+1 to n
4.     if A[k] > A[j] then
5.       k ← j
6.   swap A[i] and A[k]

Cost: (n-1) + (n-2) + … + 1 = O(n²)
Selection Sort Performance
• Best-case time complexity: O(n²)
• Worst-case time complexity: O(n²)
• Average time complexity: O(n²)
• Space complexity: O(1)
• In-place algorithm, no significant amount of additional space needed
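For readers who want to run it, a minimal Python sketch of the same procedure (0-based indexing instead of the slides' 1-based arrays):

```python
def selection_sort(a):
    """In-place selection sort, ascending; mirrors the pseudocode above."""
    n = len(a)
    for i in range(n - 1):
        k = i  # index of the smallest item seen so far in a[i..n-1]
        for j in range(i + 1, n):
            if a[k] > a[j]:
                k = j
        a[i], a[k] = a[k], a[i]  # move the smallest remaining item to slot i

data = [8, 6, 1, 3, 7, 2, 5, 4]
selection_sort(data)
print(data)  # [1, 2, 3, 4, 5, 6, 7, 8]
```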
Insertion Sort
• Idea of insertion sort
• For each new card, insert it into the correct position among the existing set of cards
• Correct position: keep comparing with the previous card, swap if in reverse order
Insertion Sort
Sort A[1…n] in ascending order:

1. for integer i ← 2 to n
2.   for integer j ← i downto 2
3.     if A[j-1] > A[j] then
4.       swap A[j-1] and A[j]
5.     else
6.       terminate inner loop

Cost: 1 + 2 + … + (n-1) = O(n²)
Insertion Sort Performance
• Best-case time complexity: O(n)
• Worst-case time complexity: O(n²)
• Average time complexity: O(n²)
• Space complexity: O(1)

• More efficient in practice than most other simple quadratic (O(n²)) algorithms such as selection sort or bubble sort
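A matching Python sketch of the swap-based insertion sort above (0-based indexing):

```python
def insertion_sort(a):
    """In-place insertion sort, ascending; swap-based, as in the pseudocode."""
    n = len(a)
    for i in range(1, n):
        j = i
        # Keep comparing with the previous item; swap while in reverse order.
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
```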
Bubble Sort
• Idea of bubble sort
• In each pass:
• Compare each pair of adjacent items
• Swap them if they are in the wrong order
• Repeat the pass through the list until no swap is needed
Bubble Sort
• Sort A[1…n] in ascending order:

1. for integer i ← 1 to n-1
2.   for integer j ← 1 to n-i
3.     if A[j] > A[j+1] then
4.       swap A[j] and A[j+1]
5. if no swap is made in the j loop, then terminate (why?)

Cost: (n-1) + (n-2) + … + 1 = O(n²)
Bubble Sort Performance
• Best-case time complexity: O(n)
• Worst-case time complexity: O(n²)
• Average time complexity: O(n²)
• Space complexity: O(1)

• Similar to selection sort…
• …but requires a lot more swaps than selection sort, since comparisons/swaps are limited to adjacent items
• Very limited usage in the real world
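A Python sketch with the early-termination check from line 5 (0-based indexing):

```python
def bubble_sort(a):
    """In-place bubble sort, ascending, with early termination."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # a swap-free pass means the array is already sorted
            break
```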
Stability
• A stable sorting algorithm maintains the relative order of records with equal keys (i.e., values)
• If there are two integers of the same value, whichever was in front in the original unsorted array remains in front in the sorted array

• Which of the following is/are stable?
A. Selection sort  B. Insertion sort  C. Bubble sort
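A quick illustration of what stability means in practice, using Python's built-in sort (which is guaranteed stable); the student records here are made-up example data:

```python
# Two records share the key 85; a stable sort keeps their input order.
records = [("alice", 85), ("bob", 70), ("carol", 85)]

print(sorted(records, key=lambda r: r[1]))
# [('bob', 70), ('alice', 85), ('carol', 85)] -- alice stays ahead of carol
```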
Heapsort
• Idea: use a max/min binary heap
• Construct a max/min binary heap (implemented on an array) from the input
• In each iteration, conduct a delete_max (delete_min) operation on the binary heap, until it is empty
• To save space, keep a sorted area at the end of the original array
• Swap the root and the last item of the unsorted part, and grow the sorted area by one
Heapsort
Heapsort Performance
• Best-case time complexity: O(n log n)
• Worst-case time complexity: O(n log n)
• Average time complexity: O(n log n)
• Space complexity: O(1)
• Unstable (why?)

• Usually 2-3 times slower than a well-implemented quicksort
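A minimal in-place Python sketch of this scheme, building a max-heap directly in the array and growing the sorted area from the end (sift_down is a helper name chosen here, not from the slides):

```python
def heapsort(a):
    """In-place heapsort using a max-heap stored in the array itself."""
    n = len(a)

    def sift_down(root, end):
        # Push a[root] down until the max-heap property holds in a[0..end-1].
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1  # pick the larger of the two children
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    # Build the max-heap bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)

    # Repeatedly move the max into the sorted area at the end of the array.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
```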


Mergesort
• Idea: divide and conquer
• Divide: divide the problem into smaller subproblems
• Conquer: solve each subproblem recursively
• Combine: combine the solutions of the subproblems into the solution of the original problem
Mergesort
Mergesort(A, n):
1. if n > 1
2.   p ← ⌊n/2⌋
3.   B[1..p] ← A[1..p]
4.   C[1..n-p] ← A[p+1..n]
5.   Mergesort(B, p)
6.   Mergesort(C, n-p)
7.   A[1..n] ← Merge(B, p, C, n-p)

• Divide: divide the array into two subarrays of about n/2 numbers each
• Conquer: sort the two subarrays recursively
• Combine: merge the two sorted subarrays into one sorted array
Mergesort
Mergesort: Combine Phase
Merge(B, p, C, q):
1. n ← p+q
2. Let A[1..n] be a new array
3. i ← 1, j ← 1
4. for k ← 1 to n
5.   if i ≤ p and ( j > q or B[i] ≤ C[j] )
6.     A[k] ← B[i]; i ← i+1
7.   else
8.     A[k] ← C[j]; j ← j+1
9. return A

Time of Merge: O(n)
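A Python sketch of both phases; for simplicity it returns a new sorted list rather than writing back into A, which matches the O(n) extra space noted later:

```python
def merge(b, c):
    """Merge two sorted lists into one sorted list in O(n) time."""
    a, i, j = [], 0, 0
    while i < len(b) or j < len(c):
        # Take from b while it still has items and its head is no larger
        # (the <= keeps the merge stable).
        if i < len(b) and (j >= len(c) or b[i] <= c[j]):
            a.append(b[i]); i += 1
        else:
            a.append(c[j]); j += 1
    return a

def mergesort(a):
    """Return a sorted copy of a. O(n log n) time, O(n) extra space."""
    if len(a) <= 1:
        return a
    p = len(a) // 2
    return merge(mergesort(a[:p]), mergesort(a[p:]))
```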
Mergesort
Mergesort Performance
• Let T(n) be the running time of Mergesort(A, n) (pseudocode above)
• Lines 3, 4 take O(n) time
• Line 5 takes T(n/2) time
• Line 6 takes T(n/2) time
• Line 7 takes O(n) time
• T(1) = O(1)

• Thus, T(n) = 2T(n/2) + O(n)
General Form of Divide and Conquer Time Complexity

• Assume the problem is divided into a subproblems, each of size n/b
• a and b could be different
• The combine phase takes O(n^c) time
• Then the recurrence equation is T(n) = aT(n/b) + O(n^c)
Master Theorem
• T(n) = aT(n/b) + O(n^c)

• If log_b a < c, then T(n) = O(n^c)
• If log_b a = c, then T(n) = O(n^c log n)
• If log_b a > c, then T(n) = O(n^(log_b a))

• For Mergesort, a = 2, b = 2, c = 1, so log_2 2 = 1 = c
• So the time complexity for mergesort is O(n log n)
Mergesort Performance
• Best-case time complexity: O(n log n)
• Worst-case time complexity: O(n log n)
• Average time complexity: O(n log n)
• Space complexity: depends on the implementation, normally O(n)
• Stable (why?)
Randomized Algorithms
• So far, all our algorithms have been deterministic
• Quicksort, the fastest sorting algorithm in practice, may include randomization
• Randomized algorithms are not ambiguous: generating an outcome in a randomized manner is a well-defined procedure
• With randomized algorithms, we talk about expected running time
• Example: if we have n distinct numbers in array A and want to find the position of one of the numbers, c, we can repeatedly pick a random integer i between 1 and n and check whether A[i] = c
• Expected running time: O(n), since each probe succeeds with probability 1/n, so n probes are expected
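A sketch of this probing example (assuming c really is in the array; otherwise the loop never terminates):

```python
import random

def random_search(a, c):
    """Randomly probe positions until c is found.
    Each probe succeeds with probability 1/n, so expected O(n) probes."""
    n = len(a)
    while True:
        i = random.randrange(n)
        if a[i] == c:
            return i
```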
A very bad randomized sorting algorithm…
• If we want to sort a deck of n cards, we can throw them into the air
until they fall on the ground exactly sorted
• Obviously extremely inefficient
• Throwing the cards is a randomized operation
• Expected running time: O(n!)
• Bogosort/slowsort/stupid sort
Quicksort
• Idea of quicksort
• (Randomly, or with some rules) pick an integer p in A, and call it pivot
• Rearrange the integers in A such that
• All the integers smaller than p are positioned before p
• All the integers larger than p are positioned after p
• Quicksort the part of A before p recursively
• Quicksort the part of A after p recursively

• Similar to something we learnt before?


Quicksort

Example: input 13 81 92 43 65 31 57 26 75 0, with 65 selected as pivot.
Partition: {13 43 31 57 26 0}  65  {81 92 75}
Quicksort the two parts recursively: {0 13 26 31 43 57}  65  {75 81 92}
Output: 0 13 26 31 43 57 65 75 81 92
Quicksort
• Quicksort A[1…n]; initially lo = 1, hi = n:

1. if lo < hi then
2.   p ← partition(A, lo, hi)
3.   Quicksort(A, lo, p-1)
4.   Quicksort(A, p+1, hi)
Quicksort (Hoare’s scheme)

Partition(A, lo, hi):
1. p ← SELECTPIVOT(lo, hi); swap A[p] and A[hi]; pivot ← A[hi]
2. L ← lo-1, R ← hi
3. while true
4.   do L++ while A[L] < pivot
5.   do R-- while A[R] > pivot
6.   if L ≥ R
7.     swap A[L] and A[hi]
8.     return L
9.   swap A[L] and A[R]

Notes:
• SELECTPIVOT is an assumed basic operation that takes constant time. Alternatively, we can just select the last element as the pivot.
• Swapping the pivot to A[lo] also works; just remember to swap it back.
• Lines 4-5 find a value on the left side no smaller than the pivot and a value on the right side no larger than the pivot.
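A runnable Python sketch of this scheme with a random pivot; note that it adds a bounds guard (R > lo) on the right-hand scan, which the pseudocode leaves implicit:

```python
import random

def partition(a, lo, hi):
    """Hoare-style partition as in the slides: move the pivot to a[hi],
    scan inward from both ends, then drop the pivot between the halves."""
    p = random.randint(lo, hi)          # random pivot selection
    a[p], a[hi] = a[hi], a[p]
    pivot = a[hi]
    L, R = lo - 1, hi
    while True:
        L += 1
        while a[L] < pivot:             # left scan: stop at a value >= pivot
            L += 1
        R -= 1
        while R > lo and a[R] > pivot:  # right scan: stop at a value <= pivot
            R -= 1
        if L >= R:
            a[L], a[hi] = a[hi], a[L]   # pivot lands in its final position
            return L
        a[L], a[R] = a[R], a[L]

def quicksort(a, lo=0, hi=None):
    """Sort a[lo..hi] in place; expected O(n log n) time."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)
```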
Quicksort
• Worst-case: O(n²)
• In what case?
• However, quicksort is fast (O(n log n)) in expectation
• Quicksort often has an advantage over mergesort, not in theory, but in practice
• Mergesort lends itself better to parallelization, though
Quicksort Problems
• A presorted or close-to-presorted sequence may result in bad performance
• Median-of-three pivoting: take the median of A[lo], A[hi], and A[(lo+hi)/2] as the pivot
• Faster than a random pivot by about 5% in practice

• Most implementations of quicksort are unstable
• Example: 5 2 3 5 1 4, taking 3 as pivot
• If stability is required, implementations often involve O(n) extra space
Quicksort Problems
• Problem: quicksort is slower than insertion sort for small n
• Solution: cut off when n gets small (e.g. for C++, n = 32) and use other efficient algorithms (such as insertion sort)
• Even within quicksort itself, when the size of a subarray falls below a threshold, sorting it directly with insertion sort can be faster
Counting Sort
• Non-comparative sorting algorithm
• Mainly used on small integers
• Idea
• Create pre-sorted “bins” for the input
• Each bin records the appearances of items with the bin’s key value in the input
• Output the sorted sequence by listing the recorded items bin by bin
Counting Sort Example
Suppose that we have n students, each with a grade in the range 0 to 100 (thus there are k = 101 possible distinct grades, and one bin count[0..100] per grade). How can we sort them according to their grades in linear time?

Algorithm:
  initialize count[];
  while (read in a student’s record)
      insert to list count[stdnt.grade];
  for (i = 0; i < k; i++)
      if (count[i])
          output list count[i];
Counting Sort Performance
• Assume that there are k “bins”
• Counting sort takes O(n + k) time
• Space complexity: O(n + k)

• What happens when k >> n? What does this mean?
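A Python sketch of the bin-based counting sort above; the student tuples are made-up example data:

```python
def counting_sort(records, key, k):
    """Sort records whose key(r) is an integer in range(k).
    O(n + k) time and space; stable, since each bin preserves input order."""
    bins = [[] for _ in range(k)]   # one "bin" (list) per possible key value
    for r in records:
        bins[key(r)].append(r)      # record the item under its key's bin
    return [r for b in bins for r in b]  # list the bins' contents in order

students = [("dana", 88), ("eli", 100), ("fay", 88), ("gus", 0)]
print(counting_sort(students, key=lambda s: s[1], k=101))
# [('gus', 0), ('dana', 88), ('fay', 88), ('eli', 100)]
```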


Radix Sort
• Another non-comparative sorting algorithm, dating back to 1887
• Can be applied to any data that can be ordered lexicographically, be they integers, words, etc.
• When used on integers, its applicable range is larger than that of counting sort
Radix Sort
• Can start from either the most significant digit (MSD) or the least significant digit (LSD)
• Sort values according to their MSD/LSD, then move on to the next/previous digit
• Since there are only 10 values (0-9) for each digit, 10 “buckets” are needed; hence it is sometimes called bucket sort
Radix Sort
• Running time: O(nw)
• n: number of values
• w: maximum number of digits per value

• Nowadays mostly applied to binary strings and integers
• It has been shown in some benchmarks to be faster than other, more general-purpose sorting algorithms, sometimes 50% to 3 times faster
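A Python sketch of the LSD variant for non-negative decimal integers; the input values are arbitrary examples:

```python
def radix_sort(values, w):
    """LSD radix sort for non-negative integers with at most w decimal digits.
    Runs in O(n * w); correctness relies on each per-digit pass being stable."""
    for d in range(w):                     # least significant digit first
        buckets = [[] for _ in range(10)]  # one bucket per digit value 0-9
        for v in values:
            buckets[(v // 10 ** d) % 10].append(v)
        values = [v for b in buckets for v in b]  # stable per-digit pass
    return values

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66], w=3))
# [2, 24, 45, 66, 75, 90, 170, 802]
```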
