4 Sorting Types - Last Third of Class

What is a comparison-based sorting algorithm? Which of the sorts we discussed are comparison-based?

A comparison sort reads through the array using comparisons (such as <=) to determine which element comes first. All of the sorts we discussed are comparison-based: Bubble, Heap, Merge, and Quick.

What are some sorting applications?

Almost all shopping websites have sorting as a core function. Searching is one of the most important applications of sorting. Sorting also makes it easy to delete duplicates.

What is a drawback of merge sort?

An auxiliary array is used: the merge helper function creates a temporary array, which is created and destroyed during processing. The auxiliary arrays use (n/2) + (n/2) = n extra space, so the space complexity is O(n).

How does Quick Sort use the "divide and conquer" idea? What is another name for quick sort?

Divide step: choose a pivot item p and partition the array into two parts:
- items in S1 are <= p
- items in S2 are > p
Then, by the ordering property, p is placed between S1 and S2. Recursively sort the two parts.
Conquer step: do nothing! No merging is needed, which is distinct from merge sort.
Quick sort is also known as "Partition and Exchange" sort (note that bubble sort's alternative name is just "Exchange" sort).
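A minimal C++ sketch of this idea (not the course's code; it uses Lomuto-style partitioning with the last element as the pivot, which the lecture does not require):

#include <algorithm> // std::swap
#include <vector>

// Partition a[lo..hi] around the pivot a[hi]: everything <= pivot ends up
// in S1 (left), everything > pivot in S2 (right), and the pivot lands in between.
int partition(std::vector<int>& a, int lo, int hi) {
    int p = a[hi];              // pivot value (last element, one common choice)
    int i = lo;                 // boundary of the "<= pivot" region S1
    for (int j = lo; j < hi; ++j) {
        if (a[j] <= p) {
            std::swap(a[i], a[j]);
            ++i;
        }
    }
    std::swap(a[i], a[hi]);     // place pivot between S1 and S2
    return i;                   // pivot's final index
}

void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo < hi) {
        int p = partition(a, lo, hi); // divide: build S1 and S2
        quickSort(a, lo, p - 1);      // recursively sort S1
        quickSort(a, p + 1, hi);      // recursively sort S2
        // conquer step: nothing to do, no merge needed
    }
}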

Bubble Sort used to be called what?

Exchange Sorting

Walk through the steps of bubble sort. Why is it one of the simplest but most inefficient sorting algorithms?

Idea: move through the array and rearrange it into ascending order one number at a time. After the 1st pass the largest element is guaranteed to be at the end; after the 2nd pass the last two elements are guaranteed to be in order; and so on.

for (pass = 0; pass < size - 1; pass++)
    for (i = 0; i < size - pass - 1; i++)
        if (A[i] > A[i+1])
            swap A[i] and A[i+1]

It is simple because it only compares and swaps adjacent elements, but inefficient because it makes n(n-1)/2 comparisons, i.e. O(n^2).
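A direct C++ rendering of this pseudocode (a sketch, not the course's exact code):

#include <algorithm> // std::swap
#include <vector>

// After pass k, the largest k+1 elements sit in their final positions
// at the end of the array.
void bubbleSort(std::vector<int>& A) {
    int size = static_cast<int>(A.size());
    for (int pass = 0; pass < size - 1; ++pass) {
        for (int i = 0; i < size - pass - 1; ++i) {
            if (A[i] > A[i + 1]) {
                std::swap(A[i], A[i + 1]); // adjacent pair out of order: swap
            }
        }
    }
}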

Time complexity of heap sort?

O(n log n) guaranteed, no matter what the array looks like.
Step 1: construct the heap - O(log n) per node for n nodes -> O(n log n)
Step 2: delete max repeatedly - O(log n) per delete for n nodes -> O(n log n)
Step 3: linear scan to read the temp array back into the original array -> O(n)
Total: O(n log n) + O(n log n) + O(n) = O(n log n)

How can you improve Quick Sort to avoid the worst case of O(n^2), where you have to partition at every element?

Randomized Quick Sort: instead of choosing p from a fixed position, randomize the choice somehow. Random sampling: select a random element of the subarray as the pivot. Expected run time: O(n log n).
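One way to implement the random pivot choice (a sketch; it assumes the partition() function from the quick sort sketch above, and the name randomizedPartition is made up):

#include <algorithm> // std::swap
#include <random>
#include <vector>

// Pick a uniformly random index in [lo, hi], move that element into the
// pivot position, then partition exactly as before.
int randomizedPartition(std::vector<int>& a, int lo, int hi) {
    static std::mt19937 gen{std::random_device{}()};
    std::uniform_int_distribution<int> dist(lo, hi);
    std::swap(a[dist(gen)], a[hi]); // random element becomes the pivot
    return partition(a, lo, hi);    // partition() from the earlier sketch
}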

What is a stable sort? Which ones (that we discussed) are?

A stable sort keeps equal elements in the same relative order they appear in the input.
Bubble - yes: if two elements are equal, there is no swap.
Heap - with a max heap, no, because the temp array is read backwards into the original array; with a min heap, yes.
Merge - yes.
Quick - no: an element equal to the pivot is placed in S1, which can swap it past other equal elements.
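A small made-up illustration of stability (records are (name, key) pairs sorted by key only; the data is arbitrary):

#include <iostream>
#include <string>
#include <utility>
#include <vector>

int main() {
    // Two records share the key 2; a stable sort must keep "a" before "b".
    std::vector<std::pair<std::string, int>> v = {{"a", 2}, {"x", 1}, {"b", 2}};
    int n = static_cast<int>(v.size());
    // Bubble sort on the key, comparing with > so equal keys are never swapped.
    for (int pass = 0; pass < n - 1; ++pass)
        for (int i = 0; i < n - pass - 1; ++i)
            if (v[i].second > v[i + 1].second)
                std::swap(v[i], v[i + 1]);
    for (const auto& r : v) std::cout << r.first << ":" << r.second << " ";
    std::cout << "\n"; // prints: x:1 a:2 b:2 (the two key-2 records keep their input order)
}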

Describe the merge algorithm. What is the max number of comparisons for merging two sorted arrays? Max assignments? Is it O(n)?

Use two flags/cursors to track the current element in each array. If the two arrays have sizes n and m, the temp array has size n + m. The elements at the cursors are compared and the smaller one is added to the temp array.
Worst-case comparisons = (n + m) - 1; the -1 is because the very last element does not need a comparison.
Assignments = 2(n + m): (n + m) assignments from the original arrays into the temp array, plus (n + m) assignments from the temp array back into the original.
So the merge algorithm runs in linear time, O(n), where this n is the total input size (not the n above).
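A C++ sketch of this merge step, written against the merge(arr, start, mid, end) interface used in the merge sort pseudocode below (merging the sorted halves arr[start..mid] and arr[mid+1..end] through a temp array); not the course's exact code:

#include <vector>

// Two cursors walk the two sorted halves; the smaller current element is
// copied into temp, and temp is copied back into arr at the end. Copying out
// and back is where the 2(n+m) assignments and the O(n) extra space come from.
void merge(std::vector<int>& arr, int start, int mid, int end) {
    std::vector<int> temp;
    temp.reserve(end - start + 1);
    int i = start, j = mid + 1;                   // cursors into the two halves
    while (i <= mid && j <= end)                  // at most (n + m) - 1 comparisons
        temp.push_back(arr[i] <= arr[j] ? arr[i++] : arr[j++]);
    while (i <= mid) temp.push_back(arr[i++]);    // copy any leftovers
    while (j <= end) temp.push_back(arr[j++]);
    for (int k = 0; k < static_cast<int>(temp.size()); ++k)
        arr[start + k] = temp[k];                 // temp back into the original
}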

Explain Quick Sort time complexity, including worst case

Worst case: the initial array is already in increasing order (with the pivot taken from a fixed position) - each partition takes linear time n, and you end up partitioning at every element, so the worst case is O(n^2).
Best case: each partition splits the array in half - the number of partitions equals the number of levels of this "tree" -> log n - and each partition takes linear time -> n, so the best case is O(n log n).
The general (average) case is also O(n log n).

Can you change the value of mid in your recursive calls to mergeSort? Ex: mergeSort(a, start, start+1); mergeSort(a, start+2, end);

You can, but it takes more time (you can no longer guarantee O(n log n)). Splitting at the midpoint takes the least time.

Merge Sort Pseudocode and time complexity

mergeSort(arr[], int start, int end)
    if (start < end)
        int mid = (start + end) / 2;     // Step 1
        mergeSort(arr, start, mid);      // Step 2
        mergeSort(arr, mid + 1, end);
        merge(arr, start, mid, end);     // Step 3

Step 1: O(1)
Step 2: O(n log n) - splitting n elements in half recursively until each subarray holds 1 element creates a tree; we have to reach all of the 1-element subarrays, so the work is n times the height of the tree.
Step 3: O(n) - read through the subarrays to merge them in order.
mergeSort is O(n log n), guaranteed for any array.
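A runnable C++ rendering of this pseudocode, assuming the merge() sketch from the merge-algorithm card above is in scope:

#include <vector>

void mergeSort(std::vector<int>& arr, int start, int end) {
    if (start < end) {
        int mid = (start + end) / 2;   // Step 1: split point
        mergeSort(arr, start, mid);    // Step 2: sort left half
        mergeSort(arr, mid + 1, end);  //         sort right half
        merge(arr, start, mid, end);   // Step 3: merge the sorted halves
    }
}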

Max # of comparisons for bubble sort with n elements?

n(n-1)/2, by the Gaussian summation law.

for (pass = 0; pass < n - 1; pass++)
    for (i = 0; i < n - pass - 1; i++)
        if (A[i] > A[i+1]) swap

The max number of comparisons is the number of inner-loop iterations:
S = (n-1) + (n-2) + (n-3) + ... + 1, i.e. the sum of all numbers in [1, n).
Write the sum forwards and backwards and add:
S = 1 + 2 + 3 + ... + (n-1)
S = (n-1) + (n-2) + (n-3) + ... + 1
2S = n + n + n + ... + n   (n-1 terms)
So S = n(n-1)/2. This is T(n), which is why bubble sort is O(n^2).
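A quick numeric check of the summation (n = 10 is an arbitrary example); counting the inner-loop iterations of the bubble sort loops gives exactly n(n-1)/2:

#include <iostream>

int main() {
    int n = 10;                  // example size
    long long comparisons = 0;
    for (int pass = 0; pass < n - 1; ++pass)
        for (int i = 0; i < n - pass - 1; ++i)
            ++comparisons;       // one comparison per inner-loop iteration
    std::cout << comparisons << " == " << n * (n - 1) / 2 << "\n"; // 45 == 45
}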

What are the steps of a "divide and conquer" algorithm? Which sorting algorithms that we talked about are divide and conquer ones?

1) divide/break
2) conquer/solve
3) merge/combine
Merge Sort and Quick Sort are divide-and-conquer algorithms.

Steps of heap sort?

1. Construct a max heap out of the array.
2. Repeatedly delete the max and put each max into a temp array.
3. Read the temp array backwards into the original array.
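A C++ sketch of these three steps; the course presumably used its own heap class, so std::priority_queue stands in here as the max heap:

#include <cstddef>
#include <queue>
#include <vector>

void heapSort(std::vector<int>& a) {
    // Step 1: construct a max heap from the array.
    std::priority_queue<int> heap(a.begin(), a.end());

    // Step 2: repeatedly delete the max into a temp array (largest first).
    std::vector<int> temp;
    temp.reserve(a.size());
    while (!heap.empty()) {
        temp.push_back(heap.top());
        heap.pop();
    }

    // Step 3: read temp backwards into the original array -> ascending order.
    for (std::size_t i = 0; i < a.size(); ++i)
        a[i] = temp[a.size() - 1 - i];
}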

