Data Structures Quiz 2
throws an exception
After iteration has ended, an iterator's next method: - returns false - returns null - throws an exception - cannot be called
descendants
Any node, X, along with all of X's __, form the subtree rooted at X. - Parents - Siblings - Ancestors - Children - Descendants
The recursive call is the last step in the recursive case
Define tail recursion
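A minimal Java sketch (hypothetical example, not from the quiz) contrasting the two forms: in the tail-recursive version the recursive call is the very last step, with no work left to do after it returns.

    class TailRecursionDemo {
        // NOT tail recursive: the multiplication happens after the recursive call returns.
        static long factorial(long n) {
            if (n <= 1) return 1;
            return n * factorial(n - 1);
        }

        // Tail recursive: the recursive call is the last step; the running product
        // is carried along in an accumulator parameter.
        static long factorialTail(long n, long acc) {
            if (n <= 1) return acc;
            return factorialTail(n - 1, n * acc);
        }
    }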
22 10 3 24 12 7 31
Given the following array elements, what does the array look like after one full pass (all "slices") of shellsort, using a gap size of 3? 24 10 7 31 12 3 22
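A sketch of one shellsort pass in Java, assuming the gapped-insertion-sort formulation (with gap 3 the slices are indices 0/3/6, 1/4, and 2/5, each sorted independently); class and method names are illustrative.

    class ShellPassDemo {
        // One shellsort pass: insertion sort applied within every slice for the given gap.
        static void shellPass(int[] a, int gap) {
            for (int i = gap; i < a.length; i++) {
                int item = a[i];
                int j = i;
                while (j >= gap && a[j - gap] > item) {  // shift larger slice elements right
                    a[j] = a[j - gap];
                    j -= gap;
                }
                a[j] = item;
            }
        }
        // shellPass(new int[]{24, 10, 7, 31, 12, 3, 22}, 3) leaves 22 10 3 24 12 7 31.
    }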
7 10 12 3 22 24 31
Given the following array elements, what does the array look like after two iterations of bubble sort, arranging items in ascending order? 24 10 7 31 12 3 22
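A sketch of the bubble sort variant consistent with the answer above, where each outer-loop iteration is one pass that bubbles the largest remaining element to the end; names are illustrative.

    class BubbleSortDemo {
        static void bubbleSort(int[] a) {
            for (int pass = 0; pass < a.length - 1; pass++) {
                for (int i = 0; i < a.length - 1 - pass; i++) {
                    if (a[i] > a[i + 1]) {            // swap adjacent out-of-order pair
                        int tmp = a[i];
                        a[i] = a[i + 1];
                        a[i + 1] = tmp;
                    }
                }
            }
        }
        // After the first two passes on 24 10 7 31 12 3 22: 7 10 12 3 22 24 31.
    }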
7 10 24 31 12 3 22
Given the following array elements, what does the array look like after two iterations of insertion sort, arranging items in ascending order? 24 10 7 31 12 3 22
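A sketch of insertion sort consistent with the answer above: iteration i inserts element i into the already-sorted prefix, so two iterations leave 7 10 24 in order at the front; names are illustrative.

    class InsertionSortDemo {
        static void insertionSort(int[] a) {
            for (int i = 1; i < a.length; i++) {
                int item = a[i];
                int j = i;
                while (j > 0 && a[j - 1] > item) {   // shift larger prefix elements right
                    a[j] = a[j - 1];
                    j--;
                }
                a[j] = item;                          // drop the item into its slot
            }
        }
        // After the i = 1 and i = 2 iterations on 24 10 7 31 12 3 22: 7 10 24 31 12 3 22.
    }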
3 7 10 31 12 24 22
Given the following array elements, what does the array look like after two iterations of selection sort, arranging items in ascending order? 24 10 7 31 12 3 22
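A sketch of selection sort consistent with the answer above: iteration i swaps the smallest remaining element into position i, so two iterations place 3 and then 7; names are illustrative.

    class SelectionSortDemo {
        static void selectionSort(int[] a) {
            for (int i = 0; i < a.length - 1; i++) {
                int min = i;
                for (int j = i + 1; j < a.length; j++) {
                    if (a[j] < a[min]) min = j;   // index of the smallest remaining element
                }
                int tmp = a[i];                    // swap it into position i
                a[i] = a[min];
                a[min] = tmp;
            }
        }
        // After two iterations on 24 10 7 31 12 3 22: 3 7 10 31 12 24 22.
    }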
1 + max(height(leftChild), height(rightChild))
How can we define the height of a subtree recursively, based on the height of its children? - max(height(leftChild), height(rightChild)) - 1 + min(height(leftChild), height(rightChild)) - 1 + height(leftChild) + height(rightChild) - 1 + max(height(leftChild), height(rightChild))
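A sketch of that recursive definition on a binary tree node; it assumes the empty subtree has height -1, so a single leaf has height 0 (the course's base case may differ).

    class HeightDemo {
        static class Node {
            Node left, right;
        }

        // Height of the subtree rooted at n: one more than the taller child's height.
        static int height(Node n) {
            if (n == null) return -1;   // assumed base case: empty subtree
            return 1 + Math.max(height(n.left), height(n.right));
        }
    }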
Reverse sorted array, sorted array
In which scenarios will our simple quicksort (pivot = last element) have worst case runtime? - All scenarios - Reverse sorted array - Random order array - Sorted array
false
T or F: Some problems can only be solved with recursion; it is impossible to write an equivalent iterative algorithm
false
T or F: Some recursive algorithms are challenging to convert to an equivalent iterative version; these algorithms are primarily those with multiple recursive cases.
O(n)
Using average-case analysis, determine the average runtime of LinkedList's contains(T entry), assuming each element is equally likely to be the given entry. - O(1) - O(lg n) - O(n) - O(n^2)
O(n)
What is the best case runtime of insertion sort? - O(1) - O(lg n) - O(n) - O(n^2)
O(n^2)
What is the best case runtime of selection sort? - O(1) - O(lg n) - O(n) - O(n^2)
10 12 6 29 30 35 45 42
What is the result of Quicksort's partition routine on the following array? 10 45 35 29 42 6 12 30
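The result above matches a two-pointer partition that takes the last element (30) as the pivot: the left scan stops at an item >= the pivot, the right scan at an item <= the pivot, such pairs are swapped, and the pivot is finally swapped into the crossing point. A sketch under that assumption (the course's exact routine may differ):

    class PartitionDemo {
        // Two-pointer partition, pivot = last element; assumes low < high.
        static int partition(int[] a, int low, int high) {
            int pivot = a[high];
            int i = low - 1, j = high;
            while (true) {
                while (a[++i] < pivot) { /* a[high] == pivot stops this scan */ }
                while (pivot < a[--j]) { if (j == low) break; }
                if (i >= j) break;
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;   // swap the out-of-place pair
            }
            int tmp = a[i]; a[i] = a[high]; a[high] = tmp; // pivot into its final slot
            return i;
        }
        // partition(new int[]{10, 45, 35, 29, 42, 6, 12, 30}, 0, 7)
        // rearranges the array to 10 12 6 29 30 35 45 42 and returns index 4.
    }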
O(n)
What is the runtime of ArrayList's add(int, E) method?
O(n)
What is the runtime of merge sort's merge() routine given n elements? - O(lg n) - O(n) - O(n lg n) - O(n^2)
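A sketch of a typical merge() routine (names illustrative): every one of the n elements is copied into the temporary array once and back once, which is why the routine is O(n).

    class MergeDemo {
        // Merge the sorted runs a[low..mid] and a[mid+1..high].
        static void merge(int[] a, int low, int mid, int high) {
            int[] temp = new int[high - low + 1];
            int i = low, j = mid + 1, k = 0;
            while (i <= mid && j <= high) {
                temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];  // take the smaller front item
            }
            while (i <= mid)  temp[k++] = a[i++];              // copy any leftovers
            while (j <= high) temp[k++] = a[j++];
            System.arraycopy(temp, 0, a, low, temp.length);    // copy the merged run back
        }
    }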
O(n^2)
What is the total number of operations when iterating through a LinkedList using an indexed for() loop (calling get(i))?
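The quadratic total comes from the indexed loop: each get(i) must walk the chain from the front, so the work is roughly 1 + 2 + ... + n. A sketch contrasting it with iterator-based traversal, which is O(n) (shown with java.util.List for illustration; names are illustrative):

    import java.util.List;

    class TraversalDemo {
        static long sumByIndex(List<Integer> list) {
            long sum = 0;
            for (int i = 0; i < list.size(); i++) {
                sum += list.get(i);   // on a linked list, each get(i) is O(n): O(n^2) total
            }
            return sum;
        }

        static long sumByIterator(List<Integer> list) {
            long sum = 0;
            for (int x : list) {      // the iterator keeps its place in the chain: O(n) total
                sum += x;
            }
            return sum;
        }
    }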
O(n^2)
What is the worst-case runtime of Quicksort using Median of Three? - O(n) - O(n lg n) - O(n^2)
add(T entry)
Which method will be asymptotically slower in LinkedList than in ArrayList? - add(T entry) - add(int pos, T entry) - remove(int pos) - contains(T entry)
i-1
Which node do we need to search for in order to remove the element at index i? - i - i-1 - i+1 - All of the above
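A sketch of why node i-1 is the one we search for, assuming a singly linked chain (the Node class below is illustrative): removing node i means rerouting the previous node's next reference, and only node i-1 holds that reference.

    class RemoveDemo {
        static class Node<T> {
            T data;
            Node<T> next;
        }

        // Remove the element at index i (i >= 1 here; removing index 0 would change head).
        static <T> void removeAt(Node<T> head, int i) {
            Node<T> before = head;
            for (int k = 0; k < i - 1; k++) {   // walk to node i-1
                before = before.next;
            }
            before.next = before.next.next;     // unlink node i
        }
    }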
It is in place, has O(n lg n) average case runtime, and is usually faster than merge sort in practice
Which of the following are true of (practical implementations of) Quicksort? - Has O(n lg n) worst case runtime - Is in place - Is stable - Has O(n lg n) average case runtime - Is usually faster than merge sort in practice
exp(b, n/2)
Which of the following is a subproblem of fractional size that is useful for solving exp(b, n)? - exp(b, n-1) - exp(b-1, n-1) - exp(b/2, n) - exp(b, n/2)
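A sketch of the divide-and-conquer exponentiation that this subproblem enables: each call halves n, so only O(lg n) multiplications are needed; names are illustrative.

    class ExpDemo {
        // Solve the half-size subproblem exp(b, n/2) once, then combine;
        // one extra multiplication by b handles odd n.
        static double exp(double b, int n) {
            if (n == 0) return 1;                 // base case
            double half = exp(b, n / 2);
            return (n % 2 == 0) ? half * half : half * half * b;
        }
    }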
They tackle subproblems of fractional size
Which of the following is the distinctive feature of divide and conquer algorithms? - They make multiple recursive calls - They tackle subproblems of fractional size - They have O(lg n) runtime - They are faster than iterative algorithms
Subproblems are a fractional size of original problem
Which of the following makes divide and conquer algorithms different from other recursion approaches? - Subproblems are a fractional size of the original problem - Tackles multiple subproblems to solve the problem - Results in O(log n) runtime - Does not need a base case
add(int pos, T entry) remove(int pos) contains(T entry)
Which of the following methods are linear-time in ArrayList? - add(T entry) - add(int pos, T entry) - remove(int pos) - contains(T entry) - set(int pos, T entry)
Shell sort
Which of the following sorting algorithms cannot be easily made stable? - Insertion sort - Bubble sort - Selection sort - Shell sort
Bubble Sort
Which of the following sorting algorithms is generally slower for a large number of items? - Selection Sort - Bubble Sort
Termination (for an odd starting n, the calls eventually alternate between 1 and -1 and never reach 0)
Which of the recursion requirements does this code fail to satisfy? void foo(int n) { System.out.println("N: " + n); if(n > 0) { foo(n-2); } else if (n < 0) { foo(n+2); } }
Recursion Tree
Which technique would be most useful for analyzing the runtime of a recursive method? - Amortized analysis - Worst case analysis - Recursion tree - Average case analysis
Elements aren't ordered in a Bag, so the removed entry can simply be replaced by the last entry instead of shifting elements over
Why didn't we need a removeGap method for ArrayBag?
So that the iterator has access to the collection's private data members
Why do we make the iterator class an inner class of the collection?
So that multiple iterators can exist with different states
Why do we need to create a new class for iterator rather than allowing the collection to implement the operations itself? - To simplify the code for the collection - To compartmentalize code for different purposes - So that multiple iterators can exist with different states
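A sketch tying the last few cards together, using a hypothetical bag-style collection (not the course's ArrayBag): as an inner class the iterator reads the collection's private fields directly, each iterator object keeps its own cursor so several can traverse the same collection independently, and next() throws once iteration has ended.

    import java.util.Iterator;
    import java.util.NoSuchElementException;

    class SimpleBag<T> implements Iterable<T> {
        private final Object[] items = new Object[10];   // private data the inner class can see
        private int size = 0;

        public void add(T entry) {
            items[size++] = entry;        // sketch only: no resizing or capacity check
        }

        public Iterator<T> iterator() {
            return new BagIterator();     // every call hands back a fresh, independent iterator
        }

        private class BagIterator implements Iterator<T> {
            private int cursor = 0;       // each iterator keeps its own position

            public boolean hasNext() {
                return cursor < size;     // direct access to the outer class's private fields
            }

            @SuppressWarnings("unchecked")
            public T next() {
                if (!hasNext()) {
                    throw new NoSuchElementException();   // next() after the end throws
                }
                return (T) items[cursor++];
            }
        }
    }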