Week 6


Which of these is true? Double hashing...
- prevents collisions by keeping two tables
- produces a second index in [0, table.length)
- is less vulnerable to clustering than linear or quadratic probing
- avoids collisions by computing a hash code that's a double instead of an int
- involves hashing the hash code

is less vulnerable to clustering than linear or quadratic probing With double hashing, a second hash function is used to determine the step size for the probing sequence. With linear or quadratic probing, elements that collide also tend to have the same step size, which leads to clustering. A secondary hash breaks up this uniformity.
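A minimal sketch of that idea, assuming a table of length 13 and the common secondary hash h2(k) = 1 + (k mod 11); the class name, table size, and hash functions are illustrative, not from any particular library:

```java
// Double-hashing probe sequence: the second hash sets the step size.
public class DoubleHashingDemo {
    static final int M = 13;

    static int h1(int key) { return key % M; }

    // Secondary hash must never be 0, or probing would stand still.
    static int h2(int key) { return 1 + (key % 11); }

    // The i-th slot examined when probing for `key`.
    static int probe(int key, int i) {
        return (h1(key) + i * h2(key)) % M;
    }

    public static void main(String[] args) {
        // 18 and 31 collide under h1 (both map to 5), but their probe
        // sequences diverge because h2(18) = 8 while h2(31) = 10.
        System.out.println(probe(18, 0) + " " + probe(18, 1)); // 5, then (5+8)%13 = 0
        System.out.println(probe(31, 0) + " " + probe(31, 1)); // 5, then (5+10)%13 = 2
    }
}
```

With linear probing both keys would step by 1 from slot 5 and fight over the same run of slots; here they separate after the first collision.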

Suppose we use a hash function h to hash n distinct keys into an array T of length m. Assuming simple uniform hashing -- that is, with each key mapped independently and uniformly to a random bucket -- what is the expected number of keys that get mapped to the first bucket?
- 1 / m
- m / n
- 1 / n
- n / m

n / m With n keys distributed amongst m buckets, the expected number of keys in each bucket is n / m.
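The n/m figure follows from linearity of expectation: each key lands in the first bucket with probability 1/m, and summing over n keys gives n/m. For sizes small enough to enumerate, this can be checked exhaustively (the sizes n = 3, m = 2 below are arbitrary choices for illustration):

```java
// Average the number of keys landing in bucket 0 over all m^n
// equally likely assignments of n keys to m buckets.
public class ExpectedBucketLoad {
    public static double expectedInBucketZero(int n, int m) {
        int total = (int) Math.pow(m, n); // m^n assignments
        long keysInZero = 0;
        for (int code = 0; code < total; code++) {
            // Decode `code` as a base-m numeral: digit j = bucket of key j.
            int c = code;
            for (int j = 0; j < n; j++) {
                if (c % m == 0) keysInZero++;
                c /= m;
            }
        }
        return (double) keysInZero / total;
    }

    public static void main(String[] args) {
        System.out.println(expectedInBucketZero(3, 2)); // 1.5 == n/m
    }
}
```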

Using linear probing, and the following hash function and data to be inserted in the given order, in which array slot will the number 31 be inserted?
h(x) = x mod 13
Data: 18, 41, 22, 44, 59, 32, 31, 73
- 8
- 6
- 5
- 10

10 18 goes into slot 5, 41 into slot 2, 22 into slot 9, 44 into slot 6 (slot 5 is full), 59 into slot 7, 32 into slot 8 (slots 6 and 7 are full), and 31 into slot 10 (slots 5, 6, 7, 8 and 9 are full).
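The trace above can be replayed in code; this is a sketch of just the insertion path of a linear-probing table (no search or deletion), with illustrative names:

```java
// Insert each value in order with linear probing; return the slot each lands in.
public class LinearProbingTrace {
    public static int[] slots(int[] data, int m) {
        Integer[] table = new Integer[m];
        int[] out = new int[data.length];
        for (int k = 0; k < data.length; k++) {
            int i = data[k] % m;                      // h(x) = x mod m
            while (table[i] != null) i = (i + 1) % m; // step to the next slot on collision
            table[i] = data[k];
            out[k] = i;
        }
        return out;
    }

    public static void main(String[] args) {
        int[] data = {18, 41, 22, 44, 59, 32, 31, 73};
        int[] s = slots(data, 13);
        for (int k = 0; k < data.length; k++)
            System.out.println(data[k] + " -> slot " + s[k]); // 31 -> slot 10
    }
}
```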

What is the maximum number of elements that a properly implemented binary search will need to compare a value against in order to determine its position in a sorted list of 1,000 elements?
- 9999
- 5
- 10
- 34

10 The maximum number of comparisons that binary search performs is the base-2 logarithm of the size of its input, rounded up. Here 2**9 = 512 < 1000 <= 1024 = 2**10, so at most 10 comparisons are needed.
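The "smallest power of two at least n" bound can be computed directly; a small sketch (method name is illustrative):

```java
public class BinarySearchBound {
    // Smallest k with 2^k >= n: the max comparisons for a sorted list of n elements.
    public static int maxComparisons(int n) {
        int k = 0;
        long pow = 1;
        while (pow < n) { pow *= 2; k++; }
        return k;
    }

    public static void main(String[] args) {
        System.out.println(maxComparisons(1000)); // 10, since 512 < 1000 <= 1024
    }
}
```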

Suppose you try to perform a binary search on the unsorted array [1, 4, 3, 7, 15, 9, 24]. How many of the items in this array will be found if they are searched for?
- 4
- 0
- 6
- 3
- 5

5 7, 4, 9, 24, and 1 will be found if searched for. 15 and 3 will not, since they lie to the wrong side of a subrange's midpoint.
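This can be verified by running every element through a standard binary search; the sketch below assumes the common floor-midpoint variant (a different midpoint rule could change which items happen to be found):

```java
import java.util.Arrays;

public class UnsortedBinarySearch {
    // Standard iterative binary search with a floor midpoint.
    public static boolean found(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (a[mid] == target) return true;
            if (a[mid] < target) lo = mid + 1; else hi = mid - 1;
        }
        return false;
    }

    public static void main(String[] args) {
        int[] a = {1, 4, 3, 7, 15, 9, 24};
        long hits = Arrays.stream(a).filter(x -> found(a, x)).count();
        System.out.println(hits + " of " + a.length + " items are found"); // 5 of 7
    }
}
```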

Which of the following statements about binary search is FALSE?
- Binary search is inefficient when performed on a linked list.
- It relies on being able to efficiently access elements by their index.
- The data must be sorted.
- Binary search takes O(N) time in the worst case.

Binary search takes O(N) time in the worst case. Binary search takes O(log N) time even in the worst case.

If a hashtable's array is resized to reduce collisions, what must be done to the elements that have already been inserted?
- The existing items can be placed anywhere in the new array.
- The hashCode method must be updated.
- All items must be copied over to the same indices in the new array.
- Items must be rehashed and reinserted in the new array.

Items must be rehashed and reinserted in the new array. Since calculating a node's position in the hashtable is a function of the node's key's hashcode and the array size, all items must be reinserted if the array size changes.
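A small sketch of why copying to the same index fails: the slot is a function of both the hash code and the capacity, so the same hash maps to a different slot after a resize (the sizes 13 and 17 below are arbitrary examples):

```java
public class RehashDemo {
    // Slot for a key depends on both its hash and the current capacity.
    public static int slot(int hash, int capacity) {
        return Math.floorMod(hash, capacity);
    }

    public static void main(String[] args) {
        int hash = 22;
        System.out.println(slot(hash, 13)); // 9 in a 13-slot table
        System.out.println(slot(hash, 17)); // 5 after resizing to 17 slots
        // Copying the entry to index 9 of the new array would make it
        // unfindable: lookups would probe index 5 and miss it.
    }
}
```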

What is the worst-case runtime complexity of removing an element from a hashtable of N elements that uses chaining?
- O(N)
- O(log N)
- O(1)
- O(N**2)

O(N) In the worst case, all the elements hash to the same entry in the hashtable, where they form a chain of N elements, and the element we want to remove is at the very end of the chain, so it takes O(N) time to reach and delete it.
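The worst case is easy to construct: keys that are all congruent modulo the table size land in one bucket. The sketch below (illustrative names, not a full hashtable) counts how many chain nodes are examined before a removal:

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class ChainWorstCase {
    // Count chain nodes examined before `key` is removed from a
    // chained table with m buckets.
    public static int removalCost(List<Integer> keys, int key, int m) {
        List<LinkedList<Integer>> buckets = new ArrayList<>();
        for (int i = 0; i < m; i++) buckets.add(new LinkedList<>());
        for (int k : keys) buckets.get(k % m).addLast(k);

        LinkedList<Integer> chain = buckets.get(key % m);
        int examined = 0;
        for (int k : chain) {
            examined++;
            if (k == key) break; // found the node to delete
        }
        chain.remove(Integer.valueOf(key));
        return examined;
    }

    public static void main(String[] args) {
        // All multiples of 13 collide into bucket 0: one chain of length 5,
        // and removing the last-inserted key examines every node.
        List<Integer> keys = List.of(0, 13, 26, 39, 52);
        System.out.println(removalCost(keys, 52, 13)); // 5
    }
}
```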

In order to maintain an array in sorted order when adding an element, we can first use binary search to locate the correct position, and then insert the new element at that position. What is the worst-case runtime complexity of this algorithm?
- O(N)
- O(N**2)
- O(N log N)
- O(log N)

O(N) To locate the position will take O(log N), but to insert an element into an array we will have to shift existing elements out of the way. Since this has to shift N elements in the worst case, the insertion step is O(N) and so is the overall complexity.
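A sketch of the two steps, using the standard library's binary search (which encodes a missing key as -(insertionPoint) - 1) followed by the O(N) shift:

```java
import java.util.Arrays;

public class SortedInsert {
    // Insert `x` into sorted array `a`, returning a new array one longer.
    public static int[] insert(int[] a, int x) {
        int pos = Arrays.binarySearch(a, x);  // O(log N) to find the slot
        if (pos < 0) pos = -(pos + 1);        // decode the "not found" encoding
        int[] b = new int[a.length + 1];
        System.arraycopy(a, 0, b, 0, pos);    // elements before the slot
        b[pos] = x;
        System.arraycopy(a, pos, b, pos + 1, a.length - pos); // the O(N) shift
        return b;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(insert(new int[]{2, 5, 9, 14}, 7)));
        // [2, 5, 7, 9, 14]
    }
}
```

An in-place version over an array with spare capacity has the same complexity; the shift, not the search, dominates.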

What terminates a failed linear probe in a full hashtable?
- Revisiting the original hash index
- A null (empty or available) entry
- The end of the array
- An entry with a non-matching key

Revisiting the original hash index A null entry will not appear in a full hashtable. Seeing the end of the array isn't correct, since we need to examine all elements, including those that appear before our original hash index. An entry with a non-matching key is what started our probe in the first place. Revisiting the original hash index means we've looked at every entry and determined the item doesn't appear in the table.
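A sketch of a lookup in a completely full table (the table contents below are arbitrary; a real table would store entries with keys, not bare ints): with no empty slot to stop at, the only terminating condition for a failed search is wrapping back to the starting index.

```java
public class FullTableProbe {
    // Linear-probe lookup in a FULL table: a failed search ends only
    // when the probe wraps around to the original hash index.
    public static boolean contains(int[] table, int key) {
        int m = table.length;
        int start = key % m;
        int i = start;
        do {
            if (table[i] == key) return true;
            i = (i + 1) % m;          // wrap past the end of the array
        } while (i != start);         // revisiting `start` ends the probe
        return false;
    }

    public static void main(String[] args) {
        int[] full = {13, 14, 2, 16, 4, 5, 6, 7, 8, 9, 10, 11, 12}; // 13 slots, all occupied
        System.out.println(contains(full, 5));  // true
        System.out.println(contains(full, 31)); // false: probe visits every slot once
    }
}
```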
