ALGORITHMS ugh

What is the effect of calling MAX-HEAPIFY(A, i) when: The element A[i] is larger than its children? i > heap-size[A]/2?

In both cases nothing happens. If A[i] is at least as large as its children, the max-heap property already holds at i, so no exchange is made. If i > heap-size[A]/2, node i is a leaf and has no children, so there is nothing to fix.
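A minimal Python sketch of MAX-HEAPIFY (0-indexed array heap; the function and parameter names here are illustrative, not from the original card) showing both early-exit cases:

def max_heapify(a, i, heap_size):
    # 0-indexed children of node i
    left, right = 2 * i + 1, 2 * i + 2
    largest = i
    # If i has no children (it lies past the midpoint of the heap), the checks below fail
    # and nothing happens.
    if left < heap_size and a[left] > a[largest]:
        largest = left
    if right < heap_size and a[right] > a[largest]:
        largest = right
    # If a[i] is already >= both children, largest stays i and nothing happens.
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, heap_size)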

Write a recursive algorithm that computes and returns the total number of internal nodes of a red-black tree.

cn(x)
    if x == NIL
        return 0
    return 1 + cn(x.left) + cn(x.right)
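A runnable Python version of the same count (a sketch; the Node class and the use of None for NIL are assumptions, not part of the original answer):

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def count_internal(x):
    # NIL leaves (represented here by None) are not internal nodes.
    if x is None:
        return 0
    return 1 + count_internal(x.left) + count_internal(x.right)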

Red-black tree rules

1. Every node is either red or black.
2. The root is black.
3. Every leaf (NIL) is black.
4. If a node is red, both its children are black.
5. For each node, all simple paths from the node to descendant leaves contain the same number of black nodes.
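A small Python sketch of a checker for rules 3-5 (rules 1 and 2 are trivial to test separately); the RBNode class, the RED/BLACK constants, and the use of None for NIL are assumptions made for illustration:

RED, BLACK = "red", "black"

class RBNode:
    def __init__(self, key, color, left=None, right=None):
        self.key, self.color, self.left, self.right = key, color, left, right

def check(x):
    # Returns the number of black nodes on any path from x down to a leaf
    # (counting x itself), or raises if a rule is violated.
    if x is None:          # NIL leaf: black by rule 3
        return 1
    if x.color == RED:
        # Rule 4: a red node must have two black (possibly NIL) children.
        for child in (x.left, x.right):
            if child is not None and child.color == RED:
                raise ValueError("red node with red child")
    bh_left, bh_right = check(x.left), check(x.right)
    # Rule 5: both subtrees must contribute the same number of black nodes.
    if bh_left != bh_right:
        raise ValueError("unequal black-heights")
    return bh_left + (1 if x.color == BLACK else 0)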

Give an example of an array of size 7, containing the numbers from 1 to 7, which is not sorted or reversely sorted, and which will result in a worst-case running time for QUICKSORT.

7 1 2 3 4 5 6. With the last element as the pivot, the first call to PARTITION produces the already-sorted subarray 1 2 3 4 5, so every subsequent partition is maximally unbalanced and the total running time is Θ(n^2).

Describe similarities/differences between the divide and conquer and the dynamic programming paradigms.

D+C: break a problem into smaller subproblems that are scaled-down versions of the original, solve the smaller problems recursively, and combine their solutions to solve the bigger one. Dynamic programming: systematically works through the space of subproblems, computing the smaller solutions first, saving them in a table, and looking them up later, because recall is computationally cheaper than recomputation. Both paradigms solve a problem by combining solutions to subproblems; dynamic programming applies when those subproblems overlap.
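A minimal Python contrast using Fibonacci numbers (an illustration chosen here, not part of the original answer): the divide-and-conquer version recomputes the same subproblems over and over, while the dynamic-programming version saves each subproblem's solution in a table and looks it up.

def fib_dc(n):
    # Divide and conquer: recompute overlapping subproblems every time (exponential time).
    if n < 2:
        return n
    return fib_dc(n - 1) + fib_dc(n - 2)

def fib_dp(n, table=None):
    # Dynamic programming: solve each subproblem once, store it, look it up later (linear time).
    if table is None:
        table = {}
    if n < 2:
        return n
    if n not in table:
        table[n] = fib_dp(n - 1, table) + fib_dp(n - 2, table)
    return table[n]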

Briefly describe the applicability of dynamic programming and its two main properties.

Dynamic programming is applicable when a problem can be solved by caching solutions to subproblems. Its two main properties are optimal substructure (an optimal solution to the problem contains optimal solutions to subproblems) and overlapping subproblems (the same subproblems recur, so their solutions are worth saving and reusing).

Give a brief argument that the running time of partition on a subarray of size n is O(n)

PARTITION contains a single for loop whose body does only constant work (one comparison and at most one swap). On a subarray A[p..r] of size n = r - p + 1, the loop executes r - p = n - 1 times, once for every element except the pivot, so the running time is Θ(n - 1) = O(n).
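A 0-indexed Python sketch of the Lomuto-style PARTITION the argument refers to (written here as an illustration, not copied from the original card); the single loop visits each element of a[p..r-1] exactly once:

def partition(a, p, r):
    # Pivot is the last element of the subarray a[p..r].
    x = a[r]
    i = p - 1
    for j in range(p, r):              # executes r - p times, constant work per iteration
        if a[j] <= x:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[r] = a[r], a[i + 1]
    return i + 1                       # final position of the pivot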

CHOOSE ALG: list has 45000 records but starts only slightly out of order

INSERTION SORT: it works best when the input is only slightly out of order, running in nearly linear time on nearly sorted input.
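A Python sketch of insertion sort (illustrative, not from the original card); on nearly sorted input the inner while loop shifts only a few elements, so the total work is close to linear:

def insertion_sort(a):
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift larger elements right; on nearly sorted input this loop barely runs.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a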

CHOOSE ALG: list has 25000 records, must be sorted as quickly as possible in all cases

MERGESORT guaranteed O(nlogn)

Which sorting algorithm would you use to sort a large number of keys in guaranteed running time, assuming no memory restrictions?

With no memory restriction, the best choice would be mergesort, with a guaranteed O(nlogn) running time.

Can the depths of nodes in a red-black tree be efficiently maintained as fields in the nodes of the tree?

No, because the depth of a node depends on the depth of its parent. When the depth of a node changes, the depths of all nodes below it in the tree must be updated; an update at the root forces all n - 1 other nodes to be updated, which destroys the O(logn) performance of the tree operations.

Can the min-heap property be used to print out the keys of an n-node heap in sorted order in O(n) time?

No. The min-heap property does not tell us which subtree of a node contains the element to print right after that node: the smallest element larger than a given node could be in either of its subtrees.

CHOOSE ALG: list has 45000 records, must be sorted quickly, barely enough memory

QUICKSORT: expected O(nlogn) time, and it partitions in place, within the memory already allocated for the list.

CHOOSE ALG: list has several hundred records, records quite long but keys short

SELECTION SORT: O(n^2) comparisons but only O(n) exchanges, which matters when the records are long and expensive to move while the keys are short and cheap to compare.

Given an element x in an n-node order-statistic tree and a natural number i, how can the ith successor of x in the linear order of the tree be determined in O(logn) time?

Since OS-RANK (which returns the rank of x) and OS-SELECT (which returns the element with a given rank) each take O(logn) time, so does OS-SUCCESSOR:

OS-SUCCESSOR(T, x, i)
    r <- OS-RANK(T, x)
    return OS-SELECT(root[T], r + i)
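A Python sketch of the idea on a binary search tree whose nodes store a size attribute (the OSNode class, the key-based os_rank, and the names used here are illustrative assumptions; the balancing machinery that keeps size up to date is omitted):

class OSNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right
        # size = number of nodes in this subtree; assumed maintained by insert/delete.
        self.size = 1 + (left.size if left else 0) + (right.size if right else 0)

def os_select(x, i):
    # The node of rank i (ith smallest key) in the subtree rooted at x.
    r = (x.left.size if x.left else 0) + 1
    if i == r:
        return x
    return os_select(x.left, i) if i < r else os_select(x.right, i - r)

def os_rank(root, key):
    # Rank of the node with this key, found by walking down from the root.
    x, r = root, 0
    while x is not None:
        left_size = x.left.size if x.left else 0
        if key < x.key:
            x = x.left
        elif key > x.key:
            r += left_size + 1
            x = x.right
        else:
            return r + left_size + 1
    raise KeyError(key)

def os_successor(root, key, i):
    # The ith successor of the element with this key has rank rank(key) + i.
    return os_select(root, os_rank(root, key) + i)

For instance, in a tree holding the keys 1 through 7, os_successor(tree, 2, 3) would return the node with key 5; both helper walks descend one root-to-leaf path, so the whole call is O(logn) on a balanced tree.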

Master Method

T(n) = aT(n/b) + f(n). Compare f(n) with n^(log_b a):
Case 1: if f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a*f(n/b) <= c*f(n) for some constant c < 1 and all large n, then T(n) = Θ(f(n)).
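A worked instance (added as an illustration, not part of the original card): merge sort's recurrence T(n) = 2T(n/2) + Θ(n) has a = 2 and b = 2, so n^(log_b a) = n^(log_2 2) = n; f(n) = Θ(n) matches case 2, giving T(n) = Θ(n lg n).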

What is the difference between the MAX-HEAP property and the binary search tree property?

The MAX-HEAP property states that a node in the heap is greater than or equal to both of its children. The binary-search-tree property states that a node is greater than or equal to every node in its left subtree and less than or equal to every node in its right subtree, so it constrains entire subtrees, not just a node's immediate children.
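A short Python sketch contrasting the two properties (the array heap layout, the Node class, and the use of infinite bounds are illustrative assumptions):

import math

def is_max_heap(a):
    # Every node must be >= each of its children (0-indexed array heap).
    n = len(a)
    return all(a[i] >= a[c]
               for i in range(n)
               for c in (2 * i + 1, 2 * i + 2) if c < n)

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def is_bst(x, lo=-math.inf, hi=math.inf):
    # Every node must bound its entire left and right subtrees, not just its children.
    if x is None:
        return True
    return (lo <= x.key <= hi
            and is_bst(x.left, lo, x.key)
            and is_bst(x.right, x.key, hi))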

Briefly describe the general strategy of a divide and conquer algorithm.

The algorithm works by recursively breaking a problem down into two or more subproblems of the same or a related type until they are simple enough to be solved directly, and then combining the subproblem solutions into a solution to the original problem.
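Merge sort is the textbook example of this strategy; a Python sketch (illustrative, not taken from the original card):

def merge_sort(a):
    # Base case: simple enough to solve directly.
    if len(a) <= 1:
        return a
    # Divide: split into two subproblems of half the size, and conquer each recursively.
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]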

Assuming the elements in a max-heap are distinct, what are the possible locations of the second-largest element?

The second-largest element must be a child of the root: its parent is strictly larger than it (the elements are distinct), and the only element larger than the second largest is the maximum, which sits at the root.
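In a 0-indexed array representation this is a single comparison (a sketch assuming the heap holds at least three distinct elements):

def second_largest(a):
    # The children of the root a[0] sit at indices 1 and 2.
    return max(a[1], a[2])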

Suppose we use RANDOMIZED-SELECT to select the minimum element of the array A=... Describe a sequence of partitions that results in a worst case scenario of RANDOMIZED-SELECT

The worst case occurs when every call to RANDOMIZED-PARTITION happens to choose the largest remaining element as the pivot: each partition then removes only one element, so the procedure scans subarrays of size n, n - 1, ..., 2 before it finally isolates the minimum, for Θ(n^2) time in total.

Can we maintain the black-heights of nodes in a red-black tree as attributes in the nodes of the tree without affecting the asymptotic performance of any of the red-black tree operation? Show how, argue why or why not.

Yes. The black-height bh(x) of a node can be computed from information stored at the node and its children: bh(x) = bh(x.left) + 1 if x.left is black, and bh(x) = bh(x.left) if x.left is red, with bh(NIL) = 0. Since only the nodes along the insertion or deletion path, plus the constant number of nodes touched by rotations, need their attribute recomputed, insertion and deletion can still be performed in O(logn) time.

Presently we can solve instances of size 100 in 1 minute using algorithm A, whose running time is proportional to 2^n. We will soon need to solve instances twice this size in 1 minute. Do you think it is possible to do this by buying a faster and more expensive computer?

No. Going from n = 100 to n = 200 multiplies the work by 2^200 / 2^100 = 2^100, which is roughly 1.3 x 10^30, so no realistically faster computer can make up the difference.

Write a recursive algorithm that computes and returns the total number of black nodes in a red-black tree.

blackNodes(x)
    if x == NIL
        return 1    // a NIL leaf is black and is counted
    if x.color == BLACK
        return 1 + blackNodes(x.left) + blackNodes(x.right)
    else
        return blackNodes(x.left) + blackNodes(x.right)
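A runnable Python version (a sketch; the RBNode class with a color attribute and the use of None for NIL are assumptions, and NIL leaves are counted as black nodes, matching the pseudocode above):

class RBNode:
    def __init__(self, key, color, left=None, right=None):
        self.key, self.color, self.left, self.right = key, color, left, right

def black_nodes(x):
    if x is None:
        return 1                        # a NIL leaf is black, so it counts
    below = black_nodes(x.left) + black_nodes(x.right)
    return below + (1 if x.color == "black" else 0)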

Both counting sort and merge sort are comparison-based sorting algorithms

false

Mergesort is an in-place sorting algorithm

false

Selection is the problem of finding the rank of an element

false

The RANDOMIZED-PARTITION algorithm always returns the median element of an array.

false

The best case running time of Insertion sort is O(nlogn)

false

The ranks of nodes in a red-black tree can be maintained as fields in the nodes of the tree without affecting the asymptotic performance of any red-black tree operations.

false

What is the tight lower bound on comparison-based sorting algorithms?

Ω(nlogn) comparisons in the worst case

(n-1)! IS o(n!)

true

In a dynamic solution the subproblems used in computing the optimal value are overlapping.

true

Selection is the problem of finding the ith order statistic of a set of elements

true

The O notation is equivalent with an asymptotic "less than or equal to"

true

The number of red nodes in a red-black tree can be maintained as fields in the nodes of the tree without affecting the asymptotic performance

true

We can simultaneously find the minimum and maximum numbers in any array using at most 3n/2 comparisons

true
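A Python sketch of the pairing technique behind this bound (illustrative, not part of the original card): elements are processed two at a time, one comparison orders the pair, then one comparison goes against the running minimum and one against the running maximum, i.e. 3 comparisons per 2 elements.

def min_and_max(a):
    n = len(a)
    if n % 2:                             # odd length: the first element seeds both
        lo = hi = a[0]
        start = 1
    else:                                 # even length: one comparison seeds both
        lo, hi = (a[0], a[1]) if a[0] < a[1] else (a[1], a[0])
        start = 2
    for i in range(start, n - 1, 2):
        x, y = a[i], a[i + 1]
        small, big = (x, y) if x < y else (y, x)   # 1 comparison per pair
        if small < lo:                             # 1 comparison against the minimum
            lo = small
        if big > hi:                               # 1 comparison against the maximum
            hi = big
    return lo, hi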

