CS201 Final Exam
Suppose that we have a circular array such as the one below, with capacity = 8 and size = 4, with the front of the array being the element A at index 2 and the back being the element D at index 5 (index: 0 1 2 3 4 5 6 7; contents: - - A B C D - -). Suppose that we perform a delete front, addEnd(E), addEnd(F), addEnd(G). At what index is the element G stored? Suppose we continue from here and perform three delete front operations and an addEnd(H) operation. At what indices are the elements at the [front] and [back] of the array?
1. 0 2. front: 6 back: 1
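A minimal sketch of the mechanics behind this question, with hypothetical names (CircularArray, addEnd, deleteFront) and a fixed capacity; the point is that the back slot is computed as (front + size) mod capacity, which is why addEnd wraps around to index 0:

#define CAPACITY 8

typedef struct {
    int data[CAPACITY];
    int front;   /* index of the first (front) element */
    int size;    /* number of elements in use */
} CircularArray;

/* Append at the back: the back slot is (front + size) % CAPACITY. */
void addEnd(CircularArray *a, int value) {
    a->data[(a->front + a->size) % CAPACITY] = value;
    a->size++;
}

/* Remove from the front: just advance front with wraparound. */
void deleteFront(CircularArray *a) {
    a->front = (a->front + 1) % CAPACITY;
    a->size--;
}

Starting from front = 2 and size = 4, the delete front leaves front = 3 and size = 3, so the three addEnd calls land at indices (3+3)%8 = 6, (3+4)%8 = 7, and (3+5)%8 = 0, which is why G ends up at index 0.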
1. Start with a dynamic array that has storage for 8 elements with 4 elements "in use" by the user of the array. Suppose that the user performs 14 insert operations. Draw the resulting dynamic array and state the total storage space allocated for the dynamic array. 2. Perform 5 delete operations. How much total storage space allocated for the dynamic array now? 3. Now perform 4 insert operations. How much total storage space allocated for the dynamic array now? 4. Lastly, perform 12 delete operations. Draw the final array and state the total storage space allocated for the dynamic array.
1. 32 allocated, 18 in use 2. 32 allocated, 13 in use 3. 32 allocated, 17 in use 4. 16 allocated, 5 in use
1. Suppose that we start with a dynamic array that has capacity for 16 elements with 9 elements "in use" by the user of the array. Suppose that the user performs 24 insert operations. The resulting dynamic array should have capacity for how many elements? 2. Continue from the first Dynamic Arrays question, if the user performs 10 delete operations, the resulting dynamic array should have capacity for how many elements? 3. Continuing from the Dynamic Array Delete: Suppose that the user now performs 10 more delete operations. The resulting dynamic array should have capacity for how many elements?
1. 64 2. 64 3. 32
You have a binomial heap with B4, B2, B1, B0 trees and merge it with a binomial heap with B3, B1, B0 trees. What trees will the result have? How many nodes?
1. B5 and B1 2. 34 nodes (the two heaps have 23 and 11 nodes; 23 + 11 = 34 = 100010 in binary, so only the B5 and B1 trees are present)
Red Black Tree Properties
1. Every node is either red or black 2. The root is black 3. Every leaf (NIL) is black 4. If a node is red, then both its children are black 5. All simple paths from node to child leaves contain the same # of black nodes
If a 2-4 tree with one node has height 1, what is the maximum number of keys/values in a valid 2-4 tree of height 5?
1023. With every node a 4-node (3 keys each) the tree is a full 4-ary tree, so the maximum number of keys is 4^height - 1 = 4^5 - 1 = 1023.
Suppose that the majority element problem starts with the sequence of 1s, 2s and 3s: 1 3 2 2 3 3 1 2 1 3 1 1 1 1 What would the next sequence of values be?
2 3 1 1. Pair up consecutive elements; discard any pair whose two values differ and keep one copy of any pair whose values match: (1,3) drop, (2,2) keep 2, (3,3) keep 3, (1,2) drop, (1,3) drop, (1,1) keep 1, (1,1) keep 1.
2-4 Tree -> Red-Black Tree conversion
2-node = black node with 2 black children; 3-node = black node with 1 red child; 4-node = black node with 2 red children
What is the expected height of a treap of size n = 50, 000 closest to?
About 2 lg(N); for N = 50,000 that is roughly 2 × 15.6 ≈ 31.
The worst case O(N) time selection algorithm finds its pivot element by finding the median of the medians of groups of size
5
Binary Search Tree
A binary tree with the following additional restrictions: each node has 0, 1, or 2 children, and for every node, all values in its left subtree are smaller than the node's value while all values in its right subtree are larger.
Bubble Sort Algorithm
A sorting algorithm that makes multiple passes through the list from front to back, each time exchanging pairs of entries that are out of order
Dijkstra's algorithm
An algorithm for finding the shortest paths between nodes in a weighted graph. For a given source node, it finds the shortest path from that node to every other node. It can also be used to find the shortest path from a single source to a single destination by stopping once the shortest path to the destination has been determined. It does not handle negative edge weights.
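A compact sketch of one way to implement it, assuming an adjacency-matrix graph where 0 means "no edge", non-negative weights, and a simple O(V^2) linear scan in place of a priority queue (the function and parameter names are hypothetical):

#include <limits.h>
#include <stdbool.h>

#define V 6  /* number of vertices (small enough for an adjacency matrix) */

/* O(V^2) Dijkstra: dist[i] ends up as the shortest distance from src to i. */
void dijkstra(int graph[V][V], int src, int dist[V]) {
    bool done[V] = { false };
    for (int i = 0; i < V; i++) dist[i] = INT_MAX;
    dist[src] = 0;

    for (int iter = 0; iter < V; iter++) {
        /* pick the unfinished vertex with the smallest tentative distance */
        int u = -1;
        for (int i = 0; i < V; i++)
            if (!done[i] && (u == -1 || dist[i] < dist[u])) u = i;
        if (u == -1 || dist[u] == INT_MAX) break;   /* remaining vertices unreachable */
        done[u] = true;

        /* relax every edge (u, v) */
        for (int v = 0; v < V; v++)
            if (graph[u][v] > 0 && dist[u] + graph[u][v] < dist[v])
                dist[v] = dist[u] + graph[u][v];
    }
}

Replacing the linear scan with a binary heap gives O(E log V), and a Fibonacci heap gives O(E + V log V), as noted in the Dijkstra time-complexity card below.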
Counting Sort
An algorithm for sorting a collection of objects according to keys that are small integers; that is, it is an integer sorting algorithm, not a comparison sort. It operates by counting the number of objects that have each distinct key value and using arithmetic on those counts (prefix sums) to determine the position of each key value in the output sequence. Its running time is linear in the number of items plus the range of key values, so it is only suitable for direct use when the variation in keys is not significantly greater than the number of items. However, it is often used as a subroutine in radix sort, which can handle larger keys more efficiently. Because counting sort uses key values as indexes into an array, it is not a comparison sort, and the Ω(n log n) lower bound for comparison sorting does not apply to it. Concretely: build a count array of size max + 1, tally how many times each value occurs, then walk the counts emitting each value as many times as it was counted.
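A minimal sketch of the counting idea just described, assuming the keys are integers in 0..k and that the caller passes k (the names are hypothetical):

#include <stdlib.h>

/* Stable counting sort of a[0..n-1], keys in the range 0..k. O(n + k). */
void counting_sort(int *a, int n, int k) {
    int *count = calloc(k + 1, sizeof(int));
    int *out   = malloc(n * sizeof(int));

    for (int i = 0; i < n; i++) count[a[i]]++;            /* histogram        */
    for (int v = 1; v <= k; v++) count[v] += count[v - 1]; /* prefix sums      */
    for (int i = n - 1; i >= 0; i--)                       /* stable placement */
        out[--count[a[i]]] = a[i];

    for (int i = 0; i < n; i++) a[i] = out[i];
    free(count);
    free(out);
}

The backwards placement loop is what keeps the sort stable, which matters when counting sort is used as the per-digit pass inside radix sort.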
Amortized Analysis
Applies to worst-case sequences of operations; finds the average running time per operation, i.e., the cost per operation averaged over a whole sequence of operations.
What's the best algorithm for a directed graph with negative edge weights to find the shortest paths from V to all other vertices?
Bellman-Ford
Say you implement Dijkstra's algorithm with a Fibonacci heap; a successful relaxation utilizes which operation?
Decrease-key
Counting sort is a comparison based sorting algorithm
False
True/false In 2-4 trees, every time a 4-node is created, it's split
False
Master Theorem
Given T(n) = a T(n/b) + f(n), where a = # subproblems, n/b = size of each subproblem, and f(n) = work done outside the recursive calls:
If f(n) = O(n^(log_b(a) - e)) for some e > 0, then T(n) = θ(n^(log_b(a)))
If f(n) = θ(n^(log_b(a))), then T(n) = θ(n^(log_b(a)) * log(n))
If f(n) = Ω(n^(log_b(a) + e)) for some e > 0 AND a f(n/b) <= c f(n) for some constant c < 1, then T(n) = θ(f(n))
In simpler terms, for T(n) = a T(n/b) + O(n^d):
if log_b(a) < d, run time = O(n^d)
if log_b(a) = d, run time = O(n^d log n)
if log_b(a) > d, run time = O(n^(log_b(a)))
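Worked examples using the simpler form: merge sort is T(n) = 2 T(n/2) + O(n), so a = 2, b = 2, d = 1 and log_2(2) = 1 = d, giving O(n log n). For T(N) = 8 T(N/4) + O(N^2) (a recurrence that appears later in this set), log_4(8) = 1.5 < 2, so the O(N^2) term dominates and T(N) = O(N^2).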
2-4 Tree Insertion
Insert the new value into its place in order (at a leaf). If the node it's being inserted into is a 4-node, split it by bringing the middle value up into the parent.
Dijkstra's time complexity
Its time complexity is O(E + V log V) with Fibonacci heaps, where E is the number of edges and V is the number of vertices; O(V^2) with an unordered array.
Does the root list change on merge of Fib-Heap?
No restructuring happens: meld simply concatenates the two root lists lazily. The root list only really changes shape on decrease-key (cut subtrees are added to it) and extract-min (consolidation links trees together).
radix sort
Non-comparative integer sorting algorithm that sorts data with integer keys by grouping keys by the individual digits which share the same significant position and value. Two classifications of radix sorts are least significant digit (LSD) and most significant digit (MSD) radix sorts. LSD first sorts by the ones place, then the tens, and so forth, using a stable sort for each digit so that values with equal digits keep their earlier relative order.
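A minimal LSD sketch, assuming non-negative ints, decimal digits, and n >= 1, with a stable counting pass per digit (the names are hypothetical):

#include <stdlib.h>

/* LSD radix sort: one stable counting pass per decimal digit, 1s then 10s then 100s... */
void radix_sort(int *a, int n) {
    int max = a[0];
    for (int i = 1; i < n; i++) if (a[i] > max) max = a[i];

    int *out = malloc(n * sizeof(int));
    for (int exp = 1; max / exp > 0; exp *= 10) {
        int count[10] = { 0 };
        for (int i = 0; i < n; i++) count[(a[i] / exp) % 10]++;
        for (int d = 1; d < 10; d++) count[d] += count[d - 1];
        for (int i = n - 1; i >= 0; i--)        /* right-to-left keeps each pass stable */
            out[--count[(a[i] / exp) % 10]] = a[i];
        for (int i = 0; i < n; i++) a[i] = out[i];
    }
    free(out);
}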
Red-Black tree Rotation Time complexity
O(1)
Array Runtimes
O(1) amortized time per operation; O(N) worst-case time for any single operation.
binomial heap run times (wc)
O(1)- Make-Heap + isEmpty O(log N)- Insert, Extract-Min, Decrease-Key, Delete, meld, Find-Min
binary heap run times
O(1)- Make-Heap + isEmpty +FindMin O(log N)- Insert, Extract-Min, Decrease-Key, Delete O(N)- meld
Fibonacci Heap amortized run times
O(1)- Make-Heap + isEmpty, Decrease-Key, Insert, meld, Find-Min O(log N)- Extract-Min, Delete
What is the running time of the following code fragment: for(i=1; i<N; i++) for (k=1; k<=N; k *= 2) z++;
O(N lg N)
Pre-Order/Post-Order/In-Order run times
O(N)
WC time to extract-min on N node Fibonacci heap
O(N)
What is the max number of B0 trees in a Fibonacci heap with n nodes?
O(N)
What is the running time of the following code fragment: for (k=1; k<=N; k *= 2) for(i=1; i<k; i++) z++;
O(N)
for (k=1; k<=9999*n; k+=10) z++;
O(N)
Median of Medians time complexity
O(N) worst case (that is the point of the median-of-medians pivot choice).
linked list heap run times
O(N) for extract-min and Find-Min O(1) for everything else
Given a list of N integers in the range 0...K, what is the running time of counting sort on these values?
O(N+K)
Given a list of N integers in the range 0...N^2, what is the running time of counting sort?
O(N^2)
O(N^2 + N) is basically?
O(N^2)
Suppose we are performing the quickselect algorithm, and every time we pick a partition element we are very unlucky, picking the smallest or the largest value remaining in the set. What would the running time of this worst case behavior be?
O(N^2)
What is the solution to the recurrence : T(N) = 8 T(N/4) + O(N^2)
O(N^2)
for (i=1; i<=n*n; i++) z++;
O(N^2)
What is the solution to the recurrence : T(N) = 27 T(N/3) + O(N^3)
O(N^3 lg N)
Worst Case time to merge 2 Binomial Heaps
O(lg N)
What is the Worst Case time to add element to N node binomial heap?
O(lgN)
Binary search runtime
O(log n)
Red-Black tree Height max
O(logN)
Quick Sort Worst Case
O(n^2). In place: yes. Stable: no.
Bubble Sort Worst Case
O(n^2), reverse sorted array
Quick Sort Average Case
O(nlogn)
Merge Sort Time Complexity, stable? in-Place?
O(n log n) in all cases. Stable: yes. In place: no.
Linked lists struggle with?
Ordered Lists
Majority Element Algorithim
Phase 1: Use divide-and-conquer (pairing) to find a candidate value M. Phase 2: Check whether M really is a majority element; θ(n) time, simple loop. (See the sketch below.)
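A sketch of one way to realize the two phases, with hypothetical names (majority, count_of) and the convention that -1 means "no majority"; the exact handling of an odd leftover element may differ from the version presented in class, but the pairing step matches the reduction used in the sequence question above:

#include <stdlib.h>

/* Assumes n >= 1. */
static int count_of(const int *a, int n, int x) {
    int c = 0;
    for (int i = 0; i < n; i++) if (a[i] == x) c++;
    return c;
}

int majority(const int *a, int n) {
    int *cur = malloc(n * sizeof(int));           /* working copy for Phase 1 */
    for (int i = 0; i < n; i++) cur[i] = a[i];
    int m = n, candidate = 0, has_candidate = 0;

    while (m > 0) {
        if (m == 1) { candidate = cur[0]; has_candidate = 1; break; }
        if (m % 2 == 1) {                          /* odd length: handle the leftover */
            if (count_of(cur, m, cur[m - 1]) > m / 2) {
                candidate = cur[m - 1]; has_candidate = 1; break;
            }
            m--;                                   /* otherwise it is safe to discard */
        }
        int next = 0;                              /* pair up: keep one copy of equal pairs */
        for (int i = 0; i + 1 < m; i += 2)
            if (cur[i] == cur[i + 1]) cur[next++] = cur[i];
        m = next;                                  /* m == 0 means no candidate survives */
    }
    free(cur);

    /* Phase 2: verify the candidate against the original input, theta(n). */
    if (has_candidate && count_of(a, n, candidate) > n / 2) return candidate;
    return -1;
}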
Red Black Tree violation when insert
A violation occurs when a newly inserted red node has a red parent. Red uncle: color flip (recolor the parent and uncle black and the grandparent red, then continue checking from the grandparent). Black uncle: rotate at the grandparent (single or double rotation) and recolor.
2-4 Tree Deletion
Search for the node containing the value to be deleted. If it is a leaf node, remove the value from that node and decrease its data elements by 1. If it is not a leaf node: find the value's successor or predecessor, i.e., the smallest value among those greater than it or the largest value among those smaller than it, swap it with the value being deleted, and then delete from the leaf where that value now sits.
Merge Sort Algorithm
Sorts an array by cutting the array in half, recursively sorting each half, and then merging the sorted halves
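A minimal sketch of that idea, assuming the caller supplies a scratch buffer tmp of the same length as a (the names are hypothetical); merge_sort(a, tmp, 0, n) sorts a[0..n-1]:

#include <string.h>

/* Recursive merge sort of a[lo..hi) using scratch buffer tmp. O(n log n). */
void merge_sort(int *a, int *tmp, int lo, int hi) {
    if (hi - lo <= 1) return;                  /* 0 or 1 elements: already sorted */
    int mid = lo + (hi - lo) / 2;
    merge_sort(a, tmp, lo, mid);               /* sort left half  */
    merge_sort(a, tmp, mid, hi);               /* sort right half */

    int i = lo, j = mid, k = lo;               /* merge the two sorted halves */
    while (i < mid && j < hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];   /* <= keeps the sort stable */
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (hi - lo) * sizeof(int));
}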
Strassen's algorithm
Strassen's matrix multiplication (MM) has benefits over even highly tuned implementations of standard MM because it reduces the total number of multiplications. It is a recursive method that divides each matrix into 4 sub-matrices of dimensions n/2 x n/2 at each recursive step and combines them using 7 recursive multiplications instead of 8.
Suppose that the number of recursive calls in Strassen's matrix multiplication algorithm could be reduced from 7 to 6. What would the recurrence relation for this new algorithm be?
T(N) = 6 T(N/2) + O(N^2)
Strassen's Algorithm Recurrence relation
T(n) = 7 T(n/2) + θ(n^2), where 7 = # of recursive calls
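Solving it with the master theorem: a = 7, b = 2, f(n) = θ(n^2); log_2(7) ≈ 2.81 > 2, so T(n) = θ(n^(log_2 7)) ≈ θ(n^2.81), which beats the θ(n^3) of the straightforward algorithm. The hypothetical 6-call version from the previous card would give θ(n^(log_2 6)) ≈ θ(n^2.59) by the same reasoning.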
If the input to bubble sort is given in already sorted order, bubble sort will do n-1 comparisons and finish. Why does this not break the sorting lower bound?
The lower bound is for the worst case performance of sorting algorithms.
Median of medians algorithm
The median-of-medians algorithm is a deterministic linear-time selection algorithm. It works by dividing a list into sublists (ideally of size 5) and determining the approximate median of each sublist. Then it takes those medians, finds the median of that list, and uses it as the pivot.
counting sort runtime
The time complexity of counting sort algorithm is O(n+k) where n is the number of elements in the array and k is the range of the elements. Counting sort is most efficient if the range of input values is not greater than the number of values to be sorted.
radix sort run time
The time complexity of radix sort is given by the formula,T(n) = O(d*(n+b)), where d is the number of digits in the given list, n is the number of elements in the list, and b is the base or bucket size used, which is normally base 10 for decimal representation
Quick Sort Algorithm
This uses a divide and conquer approach. First the pivot value, which here is the first item in the list, is selected. Then the remainder of the list is partitioned into two parts: the elements less than the pivot go in the first partition and the greater elements in the second. Each partition is then sorted recursively. (Sketch below.)
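A minimal sketch using a Lomuto-style partition with the first element as the pivot, as described above (the function names are hypothetical; a real implementation would usually pick the pivot randomly or by median-of-three to avoid the O(n^2) worst case on already-sorted input):

static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Partitions a[lo..hi] around the pivot a[lo]; returns the pivot's final index. */
int partition(int *a, int lo, int hi) {
    int pivot = a[lo];
    int border = lo;                           /* right edge of the "< pivot" region */
    for (int i = lo + 1; i <= hi; i++)
        if (a[i] < pivot) swap(&a[++border], &a[i]);
    swap(&a[lo], &a[border]);                  /* move the pivot between the two regions */
    return border;
}

void quick_sort(int *a, int lo, int hi) {      /* sorts a[lo..hi] inclusive */
    if (lo >= hi) return;
    int p = partition(a, lo, hi);
    quick_sort(a, lo, p - 1);
    quick_sort(a, p + 1, hi);
}

Usage: quick_sort(a, 0, n - 1).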
Reminder: In a 2-4 tree, a 2-node whose children are both 2-nodes will compress (fuse) into a 4-node when you delete. Is this true?
True
True or false? All leaves are at the same depth in a 2-4 tree.
True
Which of the following characteristics are typical of divide and conquer solutions?
Use recurrence relations to analyze running times. Uses recursive solutions to subproblems. Sometimes require work to combine solutions of smaller problems.
Inserting 1, 2, 3, 4... into a 2-4 tree creates?
a balanced tree
Red-Black Tree -> 2-4 Tree conversion
a black node with two black children is a 2 node, a black node with one red child is a 3 node, and a black node with two red children is a 4 node
Binary Search
A search algorithm that starts at the middle of a sorted set of values and eliminates half of the data at each step; this repeats until the desired value is found or all elements have been eliminated. Compare the target to the middle element: if the target is smaller, search the left half; if it is larger, search the right half.
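A minimal iterative sketch over a sorted int array (names hypothetical); it returns the index of the target, or -1 once every element has been eliminated:

/* Binary search in a sorted array a[0..n-1]. O(log n). */
int binary_search(const int *a, int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;       /* avoids overflow of (lo + hi) */
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;  /* target can only be in the right half */
        else                 hi = mid - 1;  /* target can only be in the left half  */
    }
    return -1;                              /* every element has been eliminated */
}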
Red-Black tree
a self-balancing binary tree in which nodes are "colored" red or black. The longest path from the root to a leaf is no more than twice the length of the shortest path.
How many bits does it take to represent ? (I promise this is related to sorting)
at most (lg(n+1))/2
2-4 tree top-down insertion
Automatically split (break up) every 4-node you enter on the way down, so there is always room to insert at the leaf.
2-4 Trees
A self-balancing search tree made of 2-nodes, 3-nodes, and 4-nodes (nodes holding 1, 2, or 3 values with 2, 3, or 4 children). All leaves stay at the same depth, which prevents the worst-case O(N) run times you would get from, say, repeatedly inserting larger values into a plain BST so that it degenerates into a chain on the right side.
Given a list of N integers in the range 0...N^2, suppose we consider the values as having two "digits" (not base 10 digits :)) and make two passes of counting sort using the low order digit first. What is the time taken by each pass of counting sort O([A]) and what is the total time for the sort O([B])?
Each pass: O(N). Total time: O(N). Treating each value as a two-digit number in base N, each counting-sort pass runs in O(N + N) = O(N), and there are only two passes; this is exactly radix sort with base N.
2-4 tree top-down deletion
On the way down to the value being deleted, make sure every node you enter is not a 2-node: if it is, either borrow a value from an adjacent sibling (a transfer/rotation through the parent) or fuse it with a sibling and a value from the parent. The actual removal then happens at a leaf, swapping with the predecessor/successor first if the value sits in an internal node.
Describe an efficient algorithm to find the K smallest values from an unordered list of N values. Give the running time in terms of N and K. You may invoke any algorithm presented in class without repeating a step by step description of the algorithm.
Use a linear-time selection algorithm (quickselect, or median of medians for a worst-case guarantee) to find the K-th smallest value in O(N), then make one partition pass to gather every value less than or equal to it. Total: O(N); add O(K log K) if the K values must also be returned in sorted order. (Sketch below.)
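A sketch using randomized quickselect (expected O(N); swapping in median-of-medians pivot selection would make it worst-case O(N)). The names k_smallest and partition_around are hypothetical, and it assumes 1 <= k <= n; afterwards the first k slots of the array hold the k smallest values, in no particular order:

#include <stdlib.h>

static void swap_int(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Partitions a[lo..hi] around a[pivot_idx]; returns the pivot's final index. */
static int partition_around(int *a, int lo, int hi, int pivot_idx) {
    swap_int(&a[pivot_idx], &a[hi]);           /* park the pivot at the end */
    int pivot = a[hi], border = lo;
    for (int i = lo; i < hi; i++)
        if (a[i] < pivot) swap_int(&a[border++], &a[i]);
    swap_int(&a[border], &a[hi]);              /* pivot into its final position */
    return border;
}

/* Rearranges a[] so a[0..k-1] are the k smallest values. Assumes 1 <= k <= n. */
void k_smallest(int *a, int n, int k) {
    int lo = 0, hi = n - 1, target = k - 1;
    while (lo < hi) {
        int p = partition_around(a, lo, hi, lo + rand() % (hi - lo + 1));
        if (p == target) return;
        if (p < target) lo = p + 1;            /* k-th smallest is to the right */
        else            hi = p - 1;            /* k-th smallest is to the left  */
    }
}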
2-4 Tree Run Times
Insertion, deletion, and searching in this tree are all O(lg n).
Post-Order Traversal
left, right, root
In-Order Traversal
left, root, right
for (i=1; i<=100*n; i++) z++;
O(N)
for (k=1; k<=n; k+=2) z++;
O(N)
for (i=1; i<=n; i++) for (j=n; j>i; j--) z++;
O(N^2)
for (i=1; i<=n*n*n*n; i+=n) z++;
O(N^3)
for (j=1; j<=n*n; j++) for (k=1; k<=n; k++) p++;
O(N^3)
for (k=1; k<=n; k+=k) z++;
O(lg N)
What color is a node when it gets inserted
Red
Red/Black Tree deletion
replace with predecessor/successor depending on method asked for and then rebalance
Pre-Order Traversal
root, left, right.
Red Black Tree Time Complexities
Search, insert, remove: O(log N). Space complexity: O(N).
True/false: Let f(n) be ___ and g(n) be ___. Then f(n) is O(g(n)) and g(n) is Ω(f(n)).
true
When do we resize an array?
When it's full on an insert, double the capacity; when a delete leaves 25% or fewer of the slots in use, halve the capacity.
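A minimal sketch of that policy (the struct and function names are hypothetical; error handling for realloc is omitted):

#include <stdlib.h>

typedef struct {
    int *data;
    int size;       /* slots in use */
    int capacity;   /* slots allocated */
} DynArray;

void push_back(DynArray *a, int value) {
    if (a->size == a->capacity) {                        /* full on insert: double */
        a->capacity *= 2;
        a->data = realloc(a->data, a->capacity * sizeof(int));
    }
    a->data[a->size++] = value;
}

void pop_back(DynArray *a) {
    a->size--;
    if (a->capacity > 1 && a->size * 4 <= a->capacity) { /* <= 25% in use: halve */
        a->capacity /= 2;
        a->data = realloc(a->data, a->capacity * sizeof(int));
    }
}

Under this policy the earlier dynamic-array answers work out: a capacity-32 array is halved to 16 only once the number of elements in use falls to 8.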
Predecessor Replacement
When you extract the root from a binary search tree, you replace it with its predecessor: the next smaller value, i.e., the maximum of the left subtree. O(lg N) run time (for a balanced tree), since you have to search down for the predecessor.
Successor Replacement
When you extract the root from a binary search tree, you replace it with its successor: the next larger value, i.e., the minimum of the right subtree. O(lg N) run time (for a balanced tree), since you have to search down for the successor.
int max(int items[], int size) { int largest = items[0]; for(int i=0; i<size; i++) if (items[i] > largest) largest = items[i]; return largest; } •What is the running time of the max function if items has N values?
Roughly 2N + 2 primitive operations, which is O(N).