9 Faster Sorting Algorithms
Method to sort an entire array "a" of objects
public static void sort(Object[] a)
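This signature matches the sort method in java.util.Arrays; a minimal usage sketch (the sample data is made up) might be:

    import java.util.Arrays;

    public class SortDemo {
        public static void main(String[] args) {
            // Arrays.sort(Object[]) requires the entries to implement Comparable
            String[] names = { "Carmen", "Alice", "Brad" };
            Arrays.sort(names);
            System.out.println(Arrays.toString(names));   // prints [Alice, Brad, Carmen]
        }
    }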
What happens during the quick sort partitioning process?
Quick sort rearranges the entries in an array during the partitioning process. Each partition places one entry—the pivot—in its correct sorted position. The entries in each of the two subarrays that are before and after the pivot will remain in their respective subarrays.
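For a concrete (made-up) illustration: partitioning the array 7 2 5 9 4 around the pivot 5 could produce 2 4 5 9 7. The pivot 5 is now in its final sorted position; 2 and 4 stay in the subarray before it, and 9 and 7 stay in the subarray after it.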
How does a merge sort work?
The merge sort divides an array into halves, sorts the two halves, and then merges them into one sorted array. *is a "divide and conquer" algorithm
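A minimal recursive sketch in Java (the method names and the explicit temp-array parameter are my own choices for illustration, not necessarily the textbook's):

    // Sorts a[first..last] by recursively sorting the two halves and merging them.
    public static <T extends Comparable<? super T>> void mergeSort(T[] a, T[] temp, int first, int last) {
        if (first < last) {
            int mid = first + (last - first) / 2;
            mergeSort(a, temp, first, mid);      // sort the left half
            mergeSort(a, temp, mid + 1, last);   // sort the right half
            merge(a, temp, first, mid, last);    // merge the two sorted halves
        }
    }

    // Merges the adjacent sorted subarrays a[first..mid] and a[mid+1..last].
    private static <T extends Comparable<? super T>> void merge(T[] a, T[] temp, int first, int mid, int last) {
        int i = first, j = mid + 1, k = first;
        while (i <= mid && j <= last)
            temp[k++] = (a[i].compareTo(a[j]) <= 0) ? a[i++] : a[j++];
        while (i <= mid)  temp[k++] = a[i++];
        while (j <= last) temp[k++] = a[j++];
        for (k = first; k <= last; k++)          // copy the merged run back into a
            a[k] = temp[k];
    }

A caller would allocate the temporary array once, for example mergeSort(names, new String[names.length], 0, names.length - 1).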
To realize the Comparable interface
public class Student implements Comparable { }
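A fuller hypothetical example (the name field and the alphabetical ordering are assumptions for illustration):

    public class Student implements Comparable<Student> {
        private String name;

        public Student(String name) {
            this.name = name;
        }

        // Orders students alphabetically by name; returns a negative value, zero,
        // or a positive value, as the Comparable contract requires.
        @Override
        public int compareTo(Student other) {
            return this.name.compareTo(other.name);
        }
    }

An array of such Student objects could then be passed to a sort method such as Arrays.sort.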
During a mergeSort the actual sorting takes place during
the merge steps and not during the recursive calls
Algorithm that will swap array entry "j" with array entry "i" should "i" be greater than "j"
private static <T extends Comparable<? super T>> void order(T[] a, int i, int j)
{
   if (a[i].compareTo(a[j]) > 0)
      swap(a, i, j);
} // end order

/** Swaps the array entries array[i] and array[j]. */
private static void swap(Object[] array, int i, int j)
{
   Object temp = array[i];
   array[i] = array[j];
   array[j] = temp;
} // end swap
Which of the following are true about Merge sort?
1. The worst-case time complexity of Merge sort is O(n log n)
2. The worst-case time complexity of Merge sort is O(n²)
3. The average-case time complexity of Merge sort is O(n log n)
4. The average-case time complexity of Merge sort is O(n²)
1 & 3
Which of the following are true about Quicksort?
1. The worst-case time complexity of Quicksort is O(n log n)
2. The worst-case time complexity of Quicksort is O(n²)
3. The average-case time complexity of Quicksort is O(n log n)
4. The average-case time complexity of Quicksort is O(n²)
2 & 3
Sorting the first entry, the middle entry, and the last entry, and using the resulting middle entry as the pivot, is known as
Median-of-three pivot selection *Median-of-three pivot selection avoids quick sort's worst-case performance when the given array is already sorted or nearly sorted. While it theoretically does not avoid worst-case performance for other arrays, such performance is unlikely in practice.
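A minimal sketch of this idea, reusing the order helper from the swap card above (the method name sortFirstMiddleLast is assumed here for illustration):

    // Sorts a[first], a[mid], and a[last] into order so that a[mid] holds their median;
    // quick sort can then use a[mid] as the pivot.
    private static <T extends Comparable<? super T>> void sortFirstMiddleLast(T[] a, int first, int mid, int last) {
        order(a, first, mid);   // make a[first] <= a[mid]
        order(a, mid, last);    // make a[mid] <= a[last]
        order(a, first, mid);   // restore a[first] <= a[mid]
    }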
In a recursive merge sort the recursive calls divide the array into
n one-entry subarrays
The choice of this affects quick sort's efficiency
The pivot
Recursive call order during a merge sort (diagram)
*Merge sort rearranges the entries in an array during its merge steps.
Radix sort
- Radix sort does not use comparisons, but to work, it must restrict the data that it sorts
- Radix sort is O(n)
- Begins by grouping the integers according to their rightmost digits, moving from the array to "buckets" and back, then repeating with the next digit, and so on
- Sorting integers requires 10 buckets; sorting words requires at least 26 buckets
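A minimal sketch of a least-significant-digit radix sort for non-negative integers (my own illustration; the class and method names are made up):

    import java.util.ArrayList;
    import java.util.List;

    public class RadixSortDemo {
        // Sorts non-negative integers by repeatedly distributing them into 10 digit buckets.
        public static void radixSort(int[] a, int maxDigits) {
            List<List<Integer>> buckets = new ArrayList<>();
            for (int b = 0; b < 10; b++)
                buckets.add(new ArrayList<>());

            int divisor = 1;
            for (int pass = 0; pass < maxDigits; pass++) {
                for (int value : a)                        // array -> buckets, by the current digit
                    buckets.get((value / divisor) % 10).add(value);
                int k = 0;
                for (List<Integer> bucket : buckets) {     // buckets -> array, preserving order
                    for (int value : bucket)
                        a[k++] = value;
                    bucket.clear();
                }
                divisor *= 10;
            }
        }

        public static void main(String[] args) {
            int[] data = {329, 457, 657, 839, 436, 720, 355};
            radixSort(data, 3);
            System.out.println(java.util.Arrays.toString(data));   // [329, 355, 436, 457, 657, 720, 839]
        }
    }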
Merge sort efficiency
In general, if n is 2^k, k levels of recursive calls occur. Since k is log₂ n, mergeSort is O(n log n). Merge sort is O(n log n) in all cases. Its need for a temporary array is a disadvantage. The time required for copying entries, however, is less in Java than in other programming languages, since references are copied instead of the actual objects.
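As a worked sketch of that count (assuming each merge of n entries costs at most c·n operations): T(n) = 2·T(n/2) + c·n. Unrolling this k = log₂ n times contributes c·n at each of the k levels, giving T(n) = n·T(1) + c·n·log₂ n, which is O(n log n).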
The most resource intensive aspect of a merge sort is
The real effort during the execution of a merge sort occurs during the merge step
1. If A has only a single element, then it is already sorted, so exit. Otherwise, perform the following steps.
2. Let k = floor(n/2)
3. Recursively execute this algorithm on the sub-arrays A[1..k] and A[k+1..n]
4. Let i = 1;
5. Let j = k+1;
6. repeat
      if A[i] < A[j] then begin print A[i]; i = i+1; end;
      else begin print A[j]; j = j+1; end;
   until (i == k) or (j == n);
7. while (i < k) do print A[i]; i = i + 1; end;
8. while (j < k) do print A[j]; j = j + 1; end;
9. exit;
This uses which of the following sorting algorithms?
A Merge sort
B Quicksort
C Insertion sort
D Select sort
A Merge sort
Functional description of an iterative merge sort
An iterative merge sort starts at the beginning of the array and merges pairs of individual entries to form two-entry subarrays. Then it returns to the beginning of the array and merges pairs of the two-entry subarrays to form four-entry subarrays, and so on.
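A hedged bottom-up sketch (not the library's actual code), reusing the merge helper from the recursive sketch above:

    // Iterative (bottom-up) merge sort: merge runs of width 1, then 2, then 4, and so on.
    public static <T extends Comparable<? super T>> void iterativeMergeSort(T[] a) {
        @SuppressWarnings("unchecked")
        T[] temp = (T[]) new Comparable[a.length];
        for (int width = 1; width < a.length; width *= 2) {
            for (int first = 0; first + width < a.length; first += 2 * width) {
                int mid = first + width - 1;
                int last = Math.min(first + 2 * width - 1, a.length - 1);
                merge(a, temp, first, mid, last);   // same merge step as the recursive version
            }
        }
    }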
Consider the following algorithm to sort an array A of size n that has no duplicate elements.
1. If A has only a single element, then it is already sorted, so exit. Otherwise, perform the following steps.
2. Select an element a of the array at random.
3. Exchange elements of the array so that when done, for some k, A[k] = a, A[i] < a whenever i < k, and A[j] > a whenever j > k.
4. Recursively execute this algorithm on the sub-arrays A[1..k-1] and A[k+1..n].
This algorithm is an example of which of the following sorting algorithms?
A Merge sort
B Quicksort
C Insertion sort
D Select sort
B Quicksort
The time efficiency of quick sort
Quick sort is O(n log n) in the average case but O(n²) in the worst case.
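A hedged sketch of why: a good pivot splits the n entries roughly in half, so T(n) = 2·T(n/2) + c·n, which is O(n log n); the worst pivot (for example, always the smallest entry) leaves one subarray empty, so T(n) = T(n − 1) + c·n, which is O(n²).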
When getting the "pivot out of the way" when using median-of-three pivot selection, you will swap it with
The next-to-last entry (a[last − 1]) *known as adjusting the partition algorithm
Description of the quick sort function
The quick sort divides an array into two pieces, but unlike merge sort, these pieces are not necessarily halves of the array. Instead, quick sort chooses one entry in the array—called the pivot—and rearranges the array entries so that:
- The pivot is in the position that it will occupy in the final sorted array
- Entries in positions before the pivot are less than or equal to the pivot
- Entries in positions after the pivot are greater than or equal to the pivot
This arrangement is called a partition of the array.
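A minimal sketch of one such partition (this version uses the simple "last entry as pivot" choice rather than median-of-three, so it illustrates the idea but is not the textbook's exact algorithm):

    // Recursively sorts a[first..last]; call as quickSort(a, 0, a.length - 1).
    public static <T extends Comparable<? super T>> void quickSort(T[] a, int first, int last) {
        if (first < last) {
            int pivotIndex = partition(a, first, last);   // pivot lands in its final position
            quickSort(a, first, pivotIndex - 1);          // sort the entries before the pivot
            quickSort(a, pivotIndex + 1, last);           // sort the entries after the pivot
        }
    }

    // Partitions a[first..last] around the pivot a[last] and returns the pivot's final index.
    private static <T extends Comparable<? super T>> int partition(T[] a, int first, int last) {
        T pivot = a[last];
        int i = first - 1;                                // end of the "<= pivot" region
        for (int j = first; j < last; j++) {
            if (a[j].compareTo(pivot) <= 0) {
                i++;
                T temp = a[i]; a[i] = a[j]; a[j] = temp;  // extend the "<= pivot" region
            }
        }
        T temp = a[i + 1]; a[i + 1] = a[last]; a[last] = temp;   // place the pivot
        return i + 1;
    }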
T/F The merge sorts in the Java Class Library are stable.
True
If a sorting algorithm does not change the relative order of objects that are equal, it is considered
stable *For example, if object x appears before object y in a collection of data, and x.compareTo(y) is zero, a stable sorting algorithm will leave object x before object y after sorting the data.