CS 61B Final

A ______ or _______ traversal on a min heap with distinct elements may output the elements in sorted order. Assume there are at least three elements in the min heap.

DFS preorder, BFS (level order) The smallest item of a min heap is at the top, so whatever traversal we choose must output the top element first in a complete binary tree. Only preorder and level-order have this property.

True or False. It's impossible for the MST of a graph to contain the largest weighted edge.

False Imagine a graph where all of the edges are the same weight. Alternatively, if you want unique edge weights, imagine a graph that is just a line of vertices.

True or False. The Shortest Paths Tree returned by Dijkstra's will never be a correct MST.

False In either of the two graphs below, imagine running Dijkstra's from A. The above graph would have the SPT of AB, AC which is a valid MST. The below graph would have the SPT AB, BC which is a valid MST.

T/F You use a tree set to sort a list with non-unique values?

False Sets don't have duplicates.

True or False: If a graph with n vertices has n-1 edges, it MUST be a tree

False There could be three vertices in a cycle and one vertex that is disconnected. The statement only holds if the graph is also connected.

True/False: The shortest path from vertex u to vertex v in a graph G is the same as the shortest path from u to v using only edges in T, where T is the MST of G.

False, consider vertices C and E in the graph above (exam level 10 q3)

(T/F) Heapsort is stable.

False, stability for sorting algorithms means that if two elements in the list are defined to be equal, then they will retain their relative ordering after the sort is complete. Heap operations may scramble the relative ordering of equal items, so heapsort is not stable. As a concrete example, consider the max heap: 21 20a 20b 12 11 8 7

(T/F) For a weighted undirected graph: Adding a constant positive integer k to all edge weights will not affect any shortest path between two vertices.

False. Consider a case of 3 nodes A, B, and C where AB is 1, AC is 2.5 and BC is 1. Clearly, the best path from A to C is through B, with weight 2. However, if we add 1 to each edge weight, suddenly the path going through B will have a weight 4, while the direct path is only 3.5.

What is the primary reason we visit vertices in order of increasing distance from the source instead of decreasing distance?

Guarantees correctness.

Which sorts are asymptotically better than Quicksort's worst case?

Only merge sort and heapsort are guaranteed to beat Quicksort's N^2 worst case; both are NlogN in the worst case.

True or False. If you take any graph G with positive edge weights and square all the edge weights to produce the graph G′, G and G′ have all the same MSTs.

True For any positive numbers, if A > B then A^2 > B^2. Since the relative ordering of edge weights stays the same, none of the MSTs will change.

True or False: Dijkstra's algorithm will correctly generate a Shortest Paths Tree for some graphs that have negative edge weights.

True It won't work for all graphs that have negative edge weights, but it will work for some. For example, if the graph is a tree, there's only one path to each node, so Dijkstra's algorithm will still work.

True or False: Suppose we run Quicksort on a sorted array of distinct elements. Suppose our pivot selection is always the last element. Suppose we do not shuffle. Suppose we use Tony Hoare style partitioning. Given these suppositions, Quicksort will take N^2 time.

True Note that since we always choose the last element as the pivot, and the array is sorted, our pivot is always the largest value. Therefore, our "partition" operation just removes the last value, so we'll partition Θ(N) times. Since each partitioning takes Θ(N) time, our overall runtime will be Θ(N^2).

True or False. Prim's algorithm will work with negative edge weights.

True Prim's algorithm works by the cut property. To use it we only need to be able to find the smallest edge across a cut regardless of whether it is positive or negative. Another way to see this is that Prim's only ever considers the weight of a single edge; the algorithm never needs to worry about adding edge weights together. Thus, it can directly compare edge weights to find the smallest, which does not depend on whether any weights are positive or negative!

What is selection sort's best case and worst case? Practice it on this array: [6, 1, 4, 7, 3, 2, 5, 8]

Worst: N^2, Best: N^2. Selection sort always scans the entire unsorted portion to find the minimum, regardless of input order.

What is insertion sort's best case and worst case runtimes? Practice it on this array: [6, 1, 4, 7, 3, 2, 5, 8]

Worst: N^2, Best: N

What is merge sort's best case and worst case runtimes? Practice it on this array: [6, 1, 4, 7, 3, 2, 5, 8]

Worst: NlogN, Best: NlogN

If you pass in the same seed to Random, will it generate the same number sequence?

Yes
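As a sketch (the class and method names here are made up for illustration), two `java.util.Random` instances constructed with the same seed stay in lockstep:

```java
import java.util.Random;

public class SeedDemo {
    // java.util.Random is a deterministic pseudorandom generator:
    // the same seed always yields the same sequence of values.
    public static boolean sameSequence(long seed, int count) {
        Random r1 = new Random(seed);
        Random r2 = new Random(seed);
        for (int i = 0; i < count; i++) {
            if (r1.nextInt() != r2.nextInt()) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(sameSequence(61, 1000)); // prints "true"
    }
}
```

This is why tests and projects that take a seed parameter are reproducible.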

Is insertion sort stable?

Yes

Is merge sort stable?

Yes

Given a graph with all positive edges except one or more negative edges leaving the start node, does Dijkstra's algorithm always find the correct shortest paths tree?

Yes, it ALWAYS does.

Consider a variant of Dijkstra's algorithm DAV2 that tries to find a target node and stops as soon as the target node is dequeued. Does DAV2 always find a correct shortest path to the target on a graph with no negative edges?

Yes, this is what Dijkstra's already does.

Suppose we want to create a new class called MutationSafeHeapMinPQ. The only difference between this and the HeapMinPQ from lecture is that our new class's reporting methods (min and removeMin) must always return the correct result, even if previously added items in the priority queue are modified. In other words, it handles mutable objects perfectly fine, even if they change while in the PQ. Is this possible? How?

Yes, we just have to reheapify before each call to min and removeMin.

Suppose we use heapsort as our subroutine for radix sort MSD, creating MSD-radix-heapsort. Will this yield correct results on all inputs?

Yes.

Given a min heap with 2^n-1 distinct elements, for an element to be on the second level it must be greater than ______ element(s).

1, since the only element guaranteed to be smaller is its parent, the root.

What 3 properties make a hashcode valid?

1. It must be an integer. 2. The hashcode must be the same for the same object every time it is computed. 3. If two objects are equal, they must have the same hashcode.
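As a sketch, here is a class whose hashCode satisfies all three properties (the Point class is hypothetical, not from the course):

```java
import java.util.Objects;

public class Point {
    private final int x;
    private final int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) { return true; }
        if (!(o instanceof Point)) { return false; }
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    // Valid hashcode: (1) returns an int, (2) is deterministic because it
    // depends only on immutable fields, (3) equal Points hash identically
    // because it uses exactly the fields that equals compares.
    @Override
    public int hashCode() {
        return Objects.hash(x, y);
    }
}
```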

The fourth smallest element in a min-heap with 1000 distinct elements can appear in ______ places in the heap.

14, The 4th smallest item can be on the 2nd, 3rd, or 4th level of the heap.

Give a 5 integer array that elicits the worst case runtime for insertion sort.

A simple example is: 5 4 3 2 1. Any 5 integer array in descending order would work.

We want to sort an array of N unique numbers in ascending order. What are the best and worst runtimes of the following sort: We run an optimal sorting algorithm of our choosing knowing there are exactly N*(N-1)/2 inversions.

Best case: N, Worst case N If a list has N(N − 1)/2 inversions, it means it is sorted in descending order! So, it can be sorted in ascending order with a simple linear time pass. We know that reversing any array is a linear time operation, so the optimal runtime of any sorting algorithm is Θ(N).

We want to sort an array of N unique numbers in ascending order. What are the best and worst runtimes of the following sort: We run an optimal sorting algorithm of our choosing knowing that there are at most N inversions.

Best case: N, Worst case: N. Recall that insertion sort takes Θ(N + K) time, where K is the number of inversions. Thus, the optimal sorting algorithm is insertion sort. If K ≤ N, insertion sort has a best and worst case runtime of Θ(N).
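The Θ(N + K) behavior can be seen directly. In the sketch below (a hypothetical helper, not course code), the number of neighbor swaps insertion sort performs equals the inversion count K, since each swap removes exactly one inversion:

```java
import java.util.Arrays;

public class InsertionSortDemo {
    // Sorts a in place; returns the number of neighbor swaps performed,
    // which equals the inversion count K of the input array.
    public static int sort(int[] a) {
        int swaps = 0;
        for (int i = 1; i < a.length; i++) {
            for (int j = i; j > 0 && a[j - 1] > a[j]; j--) {
                int tmp = a[j - 1];
                a[j - 1] = a[j];
                a[j] = tmp;
                swaps += 1;
            }
        }
        return swaps;
    }

    public static void main(String[] args) {
        int[] a = {2, 1, 4, 3, 5};  // inversions: (2,1) and (4,3), so K = 2
        System.out.println(sort(a));             // prints 2
        System.out.println(Arrays.toString(a));  // prints [1, 2, 3, 4, 5]
    }
}
```

A fully descending array of length N yields N(N-1)/2 swaps, matching the Θ(N^2) worst case.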

We want to sort an array of N unique numbers in ascending order. What are the best and worst runtimes of the following sort: We run an optimal sorting algorithm of our choosing knowing there is exactly one inversion.

Best case: N, Worst case N The inversion may be the first two elements, in which case constant time is needed. Or, it may involve elements at the end, in which case N time is needed. It can be proven quite simply that no sorting algorithm can achieve a better runtime than above for the best and worst case.

We want to sort an array of N unique numbers in ascending order. What are the best and worst runtimes of the following sort: Once the runs in merge sort are of size <= N/100, we perform insertion sort on them.

Best case: N, Worst case, N^2 Once we have 100 runs of size N/100, insertion sort will take best case Θ(N) and worst case Θ(N^2) time. Note that the number of merging operations is actually constant (in particular, it takes about 7 splits and merges to get to an array of size N/2^7 = N / 128).

What is heap sort's best case and worst case runtimes? Practice it on this array: [6, 1, 4, 7, 3, 2, 5, 8]

Best case: N, Worst case: NlogN. The Θ(N) best case occurs when all elements are equal, since no sinking is needed; with distinct elements like this array, heapsort takes Θ(NlogN).

We want to sort an array of N unique numbers in ascending order. What are the best and worst runtimes of the following sort: We use a linear time median-finding algorithm to select the pivot in quicksort.

Best case: NlogN, Worst case: NlogN Doing an extra N work each iteration of quicksort doesn't asymptotically change the best case runtime, since we have to do N work to partition the array. However, it improves the worst-case runtime, since we avoid the "bad" case where the pivot is on the extreme end(s) of the partition.

We want to sort an array of N unique numbers in ascending order. What are the best and worst runtimes of the following sort: We implement heapsort with a min heap instead of a max heap. You may modify heapsort but must maintain constant space complexity.

Best: NlogN, Worst: NlogN While a max-heap is better, we can make do with a min-heap by placing the smallest element at the right end of the list until the list is sorted in descending order. Once the list is in descending order, it can be sorted in ascending order with a simple linear time pass.

True or False: Finding and using the median element of every partition as the pivot will usually result in an empirically faster quicksort than a quicksort that uses a random pivot selection strategy.

FALSE Finding the median element is actually really hard. Ironically, the fastest way to find the median uses a modified version of quicksort.

(T/F) For a weighted undirected graph: If all edges have distinct weights, the shortest path between any two vertices are unique.

False Consider a case of 3 nodes where AB is 3, AC is 5, and BC is 2. Here, the two possible paths from A to C both are of length 5.

True or False. The minimum weight edge of any cycle in a graph G will be part of any MST of G

False Consider the first graph in the picture. In the purple cycle (ABC) the minimum weight is 2. However, the MST of this graph in red (SA, SB, SC) does not contain this edge. Alternatively, consider the second graph with identical edge weights. We can call any edge the minimum and pick the other two to make a valid MST.

True or False. A graph with non unique edge weights will always have a non unique MST

False Consider the graph below with duplicate edge weights and only one MST.

True or False: Heapsort is empirically just as fast as merge sort.

False Empirically, quicksort is usually faster than mergesort, which is usually faster than heapsort. The reasons are out of scope for this class, but have to do with spatial locality (quicksort and mergesort tend to access elements that are close together, while heapsort tends to access elements that are far apart). Interestingly, there are niche circumstances where mergesort can be much more efficient than quicksort, such as when the dataset is too large to fit into memory.

True or False: given any graph with distinct edge weights and any node in the graph, there is only one possible Shortest Paths Tree that you can generate.

False Even with distinct edge weights, there can be multiple shortest paths to a node, which means there can be multiple shortest paths trees. For example, with edges AB = 3, BC = 2, and AC = 5, there are two shortest paths from A to C (the direct edge, and the path through B), each giving a different SPT.

For the array [9, 1, 1, 3, 5, 5, 6, 8] what sort will be the fastest in nanoseconds? Why?

Insertion sort: it's a small array with few inversions.

Given an array, suppose that x<y and that x appears to the right of y. What happens to the inversion count if we swap x and y?

It always decreases, by at least one: the swapped pair itself is no longer inverted, and each element between them either loses two inversions or leaves the count unchanged.

What happens to the inversion count if we min heapify an array using bottom-up heapification?

It can decrease. It can also stay the same.

What is the primary reason that we implement a priority queue using a heap instead of an ordered array?

It improves runtime. A heap supports both insert and removeMin in O(logN) time, while an ordered array can find the min in constant time but needs Θ(N) time per insertion.

Why do we use double links (next and prev) in every node instead of using only single links in a LinkedListDeque implementation?

It improves runtime because we are able to traverse forwards and backwards through the list, so operations at the back, like removeLast, take constant time.

What is the primary reason that we rotate after adding an item to a bst instead of not rotating?

It improves runtime. We are trying to avoid a long, spindly tree at all costs; rotating makes the tree more bushy, keeping operations O(logN).

What is the primary reason that shuffling before beginning quicksort is better than not shuffling?

It improves the expected runtime and makes the N^2 worst case extremely unlikely (shuffling does not strictly prevent it).

For what reasons would you use merge sort over quicksort?

Merge sort is guaranteed NlogN, but quicksort is N^2 at worst. Mergesort is stable, while quicksort isn't. Mergesort is better for linked lists.

Suppose we have a connected, undirected graph G with N vertices and N edges, where all the edge weights are identical. Find the maximum and minimum number of MSTs in G and explain your reasoning. Minimum: _________ Maximum: _________

Minimum: 3, Maximum: N Justification: Notice that if all the edge weights are the same, an MST is just a spanning tree. Let's begin by creating a tree, i.e. a connected graph with N − 1 edges. Now, notice that there is only one spanning tree, since the graph is itself a tree. As such, the problem reduces to: how many spanning trees can the insertion of one edge create? If we add an edge to a tree, it will create a cycle that can be of length at minimum 3 and at maximum N. Then, notice that we can only remove any edge from a cycle to create a spanning tree, so we have at minimum 3 and at maximum N possible MSTs in G.

What are all sorting algos lower bounded by?

N. Every sorting algorithm must at least examine each element, so all sorts are Ω(N). (Comparison-based sorts have a stronger Ω(NlogN) lower bound.)

What is the runtime of a WQU constructor with compression?

N

What is the runtime of a Weighted Quick Union constructor without compression?

N

Given a min heap with 2^n-1 distinct elements, for an element to be on the bottom level it must be greater than ______ element(s).

n-1, since it must be greater than each of the n-1 ancestors on its path up to the root.

Consider a variant of Dijkstra's algorithm DAV1 that tries to find a target node and stops as soon as the target node is enqueued. Does DAV1 always find the correct shortest path to the target on a graph with no negative edges?

NO! In Dijkstra's when we enqueue a node and its current distance from the root, we also have the possibility of later "relaxing an edge," which will update the node's distance to a shorter distance. So when we enqueue a node, the current distance may or may not already be the shortest distance.

Is heapsort stable?

No

Is quicksort stable?

No

Is selection sort stable?

No

Consider only graphs with non-negative weights: Suppose that we want to store the second shortest paths from a start vertex to every other vertex for which a second shortest path exists. Do the resulting edges form a tree? If yes explain why. If not, give a counter-example.

No. Consider a graph where A, B, and C form a cycle. The second shortest path from A→B is A→C→B, and the second shortest path from A→C is A→B→C. Together, A→C→B and A→B→C do not form a tree.

A student realizes the get operation of a hash table is constant so long that all the elements of the hash table are evenly distributed. They want to create an infinite sized hash set that would map any integer array to a sorted version of that array. If we could somehow have infinite memory, could the operation to get each hashed array take constant time?

No, because computing the hashcode of an array of length n takes Θ(n) time.

Suppose we use heapsort as our subroutine for radix sort LSD, creating LSD-radix-heapsort. Will this yield correct results on all inputs?

No, heapsort isn't stable.

One way to implement insertion sort as described in class is to call travel(0), travel(1), travel(2), ..., travel(N-1), in that order, where travel(i) is helper function where the chosen item swaps itself with its left neighbor repeatedly as long as its left neighbor is greater than it. In other words, each item is the traveler exactly once, and traveling means heading as close to the front as possible. Suppose that we instead call travel in the reverse order, e.g. travel(N-1), travel(N-2), ..., travel(0). Does this also work? Assume that the travel operation is exactly as above. If yes, explain why. If no, give a counter-example.

No. Counterexample: 3, 0, 1, 2. Here, 1 and 2 get stuck behind the 3.
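A sketch of the travel experiment (method names are made up for this card); running both orders on the counterexample shows the reverse order failing:

```java
import java.util.Arrays;

public class TravelDemo {
    // travel(i): the item at index i swaps leftward while its
    // left neighbor is greater than it.
    private static void travel(int[] a, int i) {
        while (i > 0 && a[i - 1] > a[i]) {
            int tmp = a[i - 1];
            a[i - 1] = a[i];
            a[i] = tmp;
            i -= 1;
        }
    }

    // Standard insertion sort: travel(0), travel(1), ..., travel(N-1).
    public static int[] forwardSort(int[] a) {
        int[] copy = Arrays.copyOf(a, a.length);
        for (int i = 0; i < copy.length; i++) { travel(copy, i); }
        return copy;
    }

    // Reversed call order: travel(N-1), travel(N-2), ..., travel(0).
    public static int[] reverseSort(int[] a) {
        int[] copy = Arrays.copyOf(a, a.length);
        for (int i = copy.length - 1; i >= 0; i--) { travel(copy, i); }
        return copy;
    }

    public static void main(String[] args) {
        int[] input = {3, 0, 1, 2};
        System.out.println(Arrays.toString(forwardSort(input))); // [0, 1, 2, 3]
        System.out.println(Arrays.toString(reverseSort(input))); // [0, 3, 1, 2]
    }
}
```

In the reverse order, only the 0 ever travels past the 3; by the time 1 and 2 could move, their turn as the traveler has already passed.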

Which sorts are bounded by omega NlogN

Quicksort, mergesort, and selection sort. (Insertion sort can run in Θ(N) when there are few inversions, and heapsort can run in Θ(N) when all elements are equal, since no sinking is needed; so neither is Ω(NlogN).)

Which sorting algorithms never compare the same two elements twice?

Quicksort (each element is compared against the pivot once, then each half is sorted independently), mergesort, and insertion sort.

We have a system running insertion sort and we find that it's completing faster than expected. What could we conclude about the input to the sorting algorithm?

The array is nearly sorted. Note that insertion sort has a best case runtime of Θ(N), which is when the array is already sorted.

What is the primary reason that we place the first item of a heap array at index 1 instead of index 0?

Simplifies the code, it makes it easier to access parents and children
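A small sketch of the arithmetic (helper names are hypothetical): with the root at index 1, parent and child indices come from clean multiplication and division, while a 0-indexed layout needs +1/-1 corrections everywhere.

```java
public class HeapIndex {
    // Root at index 1: no off-by-one adjustments.
    public static int parent(int i) { return i / 2; }
    public static int left(int i)   { return 2 * i; }
    public static int right(int i)  { return 2 * i + 1; }

    // Root at index 0: every formula needs a correction term.
    public static int parent0(int i) { return (i - 1) / 2; }
    public static int left0(int i)   { return 2 * i + 1; }
    public static int right0(int i)  { return 2 * i + 2; }
}
```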

Assume G is an undirected, connected graph with at least three vertices. If some edge weights are identical, can there be multiple MSTs in G?

Sometimes. (Exam Level 12: Tries and Graphs)

Assume G is an undirected, connected graph with at least three vertices. If all of the edge weights are identical, can there be multiple MSTs in G?

Sometimes. (Exam Level 12: Tries and Graphs)

True or False: The following sort is stable: We split an array up into two halves and run insertion sort on each half. Then we merge the halves together like we do in merge sort.

TRUE. Insertion sort is stable, and the merge operation is also stable (which is why mergesort is stable). This means that the entire algorithm is stable.

It is possible that Prim's and Kruskal's find different MSTs on the same graph G (as an added exercise, construct a graph where this is the case!). Given any graph G with integer edge weights, modify G to ensure that Prim's and Kruskal's will always find the same MST. You may not modify Prim's or Kruskal's, and you may not add or remove any nodes/edges.

To ensure that Prim's and Kruskal's will always produce the same MST, notice that if G has unique edges, only one MST can exist, and Prim's and Kruskal's will always find that MST! So, what if we modify G to ensure that all the edge weights are unique? To achieve this, let's strategically add a small, unique offset between 0 and 1, exclusive, to each edge. It is important that we choose an offset between 0 and 1 so that this added value doesn't change the MST, since all the edge weights are integers. It is also important that the offset is unique for each edge, because then we ensure each weight is distinct.

T/F In a weighted directed graph with non-negative weights, the smallest edge leaving a given vertex is always in the SPT rooted at that vertex.

True

T/F In a weighted, undirected graph with non-negative weights, the smallest edge touching a given vertex is always in the MST.

True

True or False: multiplying every edge in a graph with positive edge weights by some positive constant k will not change the Shortest Paths Tree that Dijkstra's algorithm generates.

True If all the edge weights were multiplied by some positive k (for example, 1000), Dijkstra's algorithm would still make all the same choices. For example, in the first question, we relaxed the edge from A to E because (5 + 1) < 9. If all the edge weights were multiplied by 1000, it would do the same thing because (5000 + 1000) < 9000.

True or False: Quicksort can be made stable using a partitioning scheme which involves 3 different arrays: one array for items less than the pivot, one array for items equal to the pivot, and one array for items greater than the pivot.

True Recall that stable means that if two elements are equal to each other, they shouldn't switch places in the array. Ordinary quicksort isn't stable because it's done in-place, which is more efficient but tends to invert the positions of equal elements. An implementation that uses external arrays (you could even do it with just two, you don't need three arrays) avoids this problem and can be made stable.
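A minimal sketch of that partitioning scheme (generic helper with assumed names; pivot choice is simplified to the first element):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class StableQuickSort {
    public static <T> List<T> sort(List<T> items, Comparator<T> cmp) {
        if (items.size() <= 1) { return new ArrayList<>(items); }
        T pivot = items.get(0);
        List<T> less = new ArrayList<>();
        List<T> equal = new ArrayList<>();
        List<T> greater = new ArrayList<>();
        // Scanning left to right preserves the relative order of equal
        // items within each bucket, which is what makes this sort stable.
        for (T item : items) {
            int c = cmp.compare(item, pivot);
            if (c < 0) { less.add(item); }
            else if (c == 0) { equal.add(item); }
            else { greater.add(item); }
        }
        List<T> result = new ArrayList<>(sort(less, cmp));
        result.addAll(equal);
        result.addAll(sort(greater, cmp));
        return result;
    }
}
```

Sorting strings by length keeps the equal-length strings "bb", "aa", "dd" in their original relative order, which an in-place quicksort would not guarantee.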

True or False: In BFS, let d(v) be the minimum number of edges between a vertex v and the start vertex. For any two vertices u, v in the fringe, |d(u) − d(v)| is always less than 2.

True Suppose this was not the case. Then, we could have a vertex 2 edges away and a vertex 4 edges away in the fringe at the same time. But, the only way to have a vertex 4 edges away is if a vertex 3 edges away was removed from the fringe. We see this could never occur because the vertex 2 edges away would be removed before the vertex 3 edges away!

True or False. A graph with unique edge weights will have exactly one MST. You might find it useful to know that Kruskal's algorithm can generate any MST depending on its tie-breaking scheme.

True The actual proofs for this question are above the level that we would expect a 61B student to understand. This is my best attempt to explain it "intuitively." Kruskal's will only need to break ties if there are non-unique edges. If all edges are unique then Kruskal's algorithm will only find a single MST. Since Kruskal's algorithm can find any MST, the one MST that Kruskal's finds is the only MST.

True or False: Every edge is looked at exactly twice in every iteration of DFS on a connected, undirected graph.

True The two vertices the edge is connecting will look at that edge when it's their turn.

True or False: for graphs that happen to be trees, it is possible to generate a Shortest Paths Tree faster than Dijkstra's algorithm would generate it.

True in O(V+E) time using BFS or DFS, while Dijkstra's algorithm would take O(ElogV) time. Since this is a tree, E=V−1, so these runtimes simplify to O(V) and O(VlogV) respectively.

(T/F) For a weighted undirected graph: If all edge weights are equal and positive, the breadth-first search starting from node A will return the shortest path from node A to target node B.

True If all edges are equal in weight, then the shortest path from A to each node is proportional to the number of edges on the path, so a breadth-first search will return the shortest path.

True/False: Adding 1 to the smallest edge of a graph G with unique edge weights must change the total weight of its MST

True, either this smallest edge (now with weight +1) is included, or this smallest edge is not included and some larger edge takes its place since there was no other edge of equal weight. Either way, the total weight increases.

True/False: If all the weights in an MST are unique, there is only one possible MST.

True, the cut property states that the minimum weight edge in a cut must be in the MST. Since all weights are unique, the minimum weight edge is always unique, so there is only one possible MST.

(T/F) For a weighted undirected graph: Multiplying a constant positive integer k to all edge weights will not affect any shortest path between two vertices.

True. Suppose we have arbitrary nodes u and v. Let's say the shortest path from u to v, before the multiplication by k, was of total weight w. This implies that every other path from u to v was of total weight greater than w. After multiplying each edge weight by k, the total weight of the shortest path becomes w ∗ k and the total weight of every other path becomes some number greater than w ∗ k. Therefore, the original shortest path doesn't change.

Imagine using quick sort as a subroutine of LSD radix sorting. What is the worst case runtime in terms of W (width) and N? Does this work correctly?

WN^2. However, this does not work correctly: quicksort is not stable, so the relative order established by earlier digit passes can be destroyed, which breaks LSD radix sort.

Imagine using mergesort as a subroutine of LSD radix sorting. What is the worst case runtime in terms of W (width) and N? Does this work correctly?

WNlogN, Merge sort is stable so this does work properly

Given an undirected graph, provide an algorithm that returns true if a cycle exists in the graph, and false otherwise. Also, provide a Θ bound for the worst case runtime of your algorithm.

We do a depth-first search traversal through the graph. While we recurse, if we visit a node that we have visited already, then we've found a cycle. Assuming integer labels, we can use something like a visited boolean array to keep track of the elements we've seen; while looking through a node's neighbors, if visited gives true, that indicates a cycle. However, since the graph is undirected, if an edge connects vertices u and v, then u is a neighbor of v, and v is a neighbor of u. As such, if we visit v after u, our algorithm would claim there is a cycle since u is a visited neighbor of v. To address this case, when we visit the neighbors of v, we should ignore u. To implement this in code, one idea is to use a helper function that takes in the parent of the current dfs call. In the worst case, we have to explore at most V edges before finding a cycle, since any acyclic graph has fewer than V edges (the total number of edges doesn't matter). So, this runs in Θ(V).
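A sketch of this algorithm, assuming a simple graph (no self-loops or parallel edges) stored as adjacency lists over vertices 0..V-1; this is a made-up minimal representation, not the course's graph class:

```java
import java.util.ArrayList;
import java.util.List;

public class CycleFinder {
    public static boolean hasCycle(List<List<Integer>> adj) {
        int v = adj.size();
        boolean[] visited = new boolean[v];
        for (int s = 0; s < v; s++) {
            if (!visited[s] && dfs(adj, visited, s, -1)) {
                return true;
            }
        }
        return false;
    }

    // parent is the vertex we arrived from; the edge back to it is ignored.
    private static boolean dfs(List<List<Integer>> adj, boolean[] visited,
                               int u, int parent) {
        visited[u] = true;
        for (int w : adj.get(u)) {
            if (!visited[w]) {
                if (dfs(adj, visited, w, u)) { return true; }
            } else if (w != parent) {
                return true; // visited neighbor that isn't our parent: cycle
            }
        }
        return false;
    }
}
```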

What is the runtime of the connected and is connected methods of WQU without compression?

logN

What is the runtime of the connected and is connected methods of WQU with compression?

logN, but nearly constant

In a min heap with distinct elements, insert has a best case runtime of ____ and a worst case runtime of ____.

Θ(1), Θ(logN).

In a min heap with distinct elements, removeMin has a best case runtime of ____ and a worst case runtime of _____.

Θ(1), Θ(logN).

Constant Sequence: c + c + c + ... + c =

𝚯(N) 1 + 1 + 1 + 1 + ... + 1 = 𝚯(N)

Geometric Sequence: 1 + r + r^2 + ... + N =

𝚯(N) 1 + 2 + 4 + 8 + ... + N = 𝚯(N)

Arithmetic Sequences: 1 + (1+r) + (1+2r) + ... + (1+(N - 1)r) =

𝚯(N^2) 1 + 2 + 3 + 4 + ... + N = 𝚯(N^2)

