Data Structures & Algorithms with Java

What is the complexity of Heap Sort?

- Combining the original heap build, which is O(N), with the sorting loop, we can see that heap sort requires O(N log2N) comparisons.
- Unlike quick sort, heap sort's efficiency is not affected by the initial order of the elements; even in the worst case it is O(N log2N).
- Heap sort is just as efficient in terms of space: only one array is used to store the data, so it requires only constant extra space.

What are the orders of the Tree Traversals?

1. Preorder: root, left, right
2. Inorder: left, root, right (visits a BST's elements in ascending order)
3. Postorder: left, right, root
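
A minimal Java sketch of the three traversals, assuming an illustrative Node class and visit helper (both are assumptions for this example, not part of the text):

    // Hypothetical minimal node type, reused by the other sketches in this set.
    class Node { int value; Node left, right; }

    static void visit(Node n) { System.out.print(n.value + " "); }

    static void preorder(Node n)  { if (n == null) return; visit(n); preorder(n.left); preorder(n.right); }
    static void inorder(Node n)   { if (n == null) return; inorder(n.left); visit(n); inorder(n.right); }   // ascending for a BST
    static void postorder(Node n) { if (n == null) return; postorder(n.left); postorder(n.right); visit(n); }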

What is the maximum number of nodes at any level N of a binary tree?

2^N. For a binary tree, the maximum number of nodes at any level N is 2^N (counting the root as level 0).

What is a Graph?

A graph G = (V, E) consists of a set of vertices V together with a set of edges E, where each edge connects a pair of vertices.

Which sorting algorithms are stable?

Some sorting algorithms are stable by nature, such as Bubble Sort, Insertion Sort, Merge Sort, and Counting Sort.

What is a good recursive method to get the size of a BST?

private int recSize(BSTNode<T> node) {
    if (node == null)
        return 0;
    else
        return 1 + recSize(node.getLeft()) + recSize(node.getRight());
}
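
In practice this private helper is started from a public wrapper; a minimal sketch, assuming the tree keeps a root field (an assumption, since the surrounding class is not shown):

    public int size() { return recSize(root); } // root is assumed to be the tree's root reference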

Which is not a stable sort in its typical implementation?
- Insertion Sort
- Merge Sort
- Quick Sort
- Bubble Sort

Quick Sort

Which sorts are unstable?

Quick Sort, Heap Sort

Describe a complete graph.

A complete graph is one in which every vertex is adjacent to every other vertex. If there are N vertices, there are N * (N - 1) edges in a complete directed graph and N * (N - 1) / 2 edges in a complete undirected graph.

Selection sort algorithm design technique is an example of?

Greedy method

Can you do better than NLogN?

No. It has been proven theoretically that we cannot do better than O(N log2N) for sorting algorithms that are based on comparing keys—that is, on pairwise comparison of elements.

For an array already sorted in ascending order what are time complexities of quick, bubble, merge and insertion?

- Quicksort: Θ(n^2) (its worst case)
- Bubblesort: Θ(n) (its best case, assuming the early-exit swapped flag)
- Mergesort: Θ(n log n) (any case)
- Insertion sort: Θ(n) (its best case)

Randomized quicksort is an extension of quicksort where the pivot is chosen randomly. What is the worst case complexity of sorting n numbers using randomized quicksort?

Randomized quicksort has an expected time complexity of O(nLogn), but its worst-case time complexity remains O(n^2): in the worst case, the randomized function can pick the index of a corner element every time.

What are the instructions of a Heap Sort?

1. Take the root (maximum) element off the heap, and put it into its place.
2. Reheap the remaining elements. (This puts the next-largest element into the root position.)
3. Repeat until there are no more elements.

- The first part of this algorithm sounds a lot like selection sort.
- What makes heap sort rapid is the second step: finding the next-largest element.
- Because the shape property of heaps guarantees a binary tree of minimum height, we make only O(log2N) comparisons in each iteration, as compared with O(N) comparisons in each iteration of selection sort.
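
A Java sketch of the whole procedure, including the O(N) heap build; the reheapDown helper here is an illustrative sift-down for a max-heap, not necessarily the text's own implementation:

    // Sketch: heap sort over an int array.
    static void heapSort(int[] values) {
        // Build the max-heap, reheaping down from the first nonleaf node (SIZE/2 - 1).
        for (int i = values.length / 2 - 1; i >= 0; i--)
            reheapDown(values, i, values.length - 1);
        // Repeatedly move the root (maximum) into its final place, then reheap the rest.
        for (int last = values.length - 1; last > 0; last--) {
            int tmp = values[0]; values[0] = values[last]; values[last] = tmp; // swap root and last
            reheapDown(values, 0, last - 1);                                   // restore heap in values[0..last-1]
        }
    }

    // Illustrative sift-down: push values[root] down until the heap property holds again.
    static void reheapDown(int[] values, int root, int bottom) {
        int left = 2 * root + 1;
        while (left <= bottom) {
            int right = left + 1;
            int maxChild = (right <= bottom && values[right] > values[left]) ? right : left;
            if (values[root] >= values[maxChild]) return;
            int tmp = values[root]; values[root] = values[maxChild]; values[maxChild] = tmp;
            root = maxChild;
            left = 2 * root + 1;
        }
    }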

Which NLogN sorts require more space?

mergeSort and quickSort require more than constant extra space: merge sort needs an auxiliary array, and quick sort uses stack space for its recursion.

Consider the array arr[5] = {5, 4, 3, 2, 1}. What are the steps of insertion performed while insertion-sorting the array?

4 5 3 2 1
3 4 5 2 1
2 3 4 5 1
1 2 3 4 5

In insertion sort, imagine that the first element is already sorted and all the elements to its right are unsorted; we then insert the unsorted elements one by one, from left to right, into the sorted part.

Sorted: 5 | Unsorted: 4 3 2 1

The first key is 4; it is inserted to the left of 5, and the array becomes:

Sorted: 4 5 | Unsorted: 3 2 1

Similarly, in every pass the key is the next unsorted value; it is compared against the sorted part and inserted into its proper position (see the sketch below).
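
A compact Java sketch of the algorithm walked through above:

    // Sketch: insertion sort; arr[0..i-1] is the sorted part in each pass.
    static void insertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];                 // next value to insert
            int j = i - 1;
            while (j >= 0 && arr[j] > key) {  // shift larger sorted values one step right
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;                 // drop the key into the gap
        }
    }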

What is the time complexity of building a Heap in order to implement Heap Sort?

During this heap construction the furthest any node moves is equal to its distance from a leaf. The sum of these distances in a complete tree is O(N), so building a heap in this manner is an O(N) operation

Which of the following sorting algorithms, in its typical implementation, gives the best performance when applied to an array which is sorted or almost sorted (at most one or two elements are misplaced)?

Insertion sort takes linear time when the input array is sorted or almost sorted (at most one or two elements are misplaced). All the other sorting algorithms mentioned take more than linear time in their typical implementations.

In a modified merge sort, the input array is split at a position one-third of its length (N). Which of the following is the tightest upper bound on the time complexity of this modified merge sort?

The time complexity is given by the recurrence T(N) = T(N/3) + T(2N/3) + N. Solving this recurrence gives T(N) = O(N logN) with the log taken to base 3/2, i.e., N log3/2N, because the recursion depth is governed by the larger 2N/3 piece.

What is the worst time complexity for a BST?

Time complexity: the worst-case time complexity of search and insert operations is O(h), where h is the height of the binary search tree. In the worst case, we may have to travel from the root to the deepest leaf node. The height of a skewed tree may become n, so the time complexity of search and insert may become O(n).

What is the special property of a BST?

To support O(log2N) searching, a BST enforces the binary search property, which is based on the relationship among the values of its elements: we put all of the nodes with values smaller than or equal to the value in the root into its left subtree, and all of the nodes with values larger than the value in the root into its right subtree.

How many levels does a complete binary tree with N nodes have?

⌊log2N⌋ + 1 levels (the deepest level has index ⌊log2N⌋, counting the root as level 0)

Given an unsorted array with the property that every element is at most k positions from its position in the sorted array, where k is a positive integer smaller than the size of the array. Which sorting algorithm can be easily modified for sorting this array, and what is the obtainable time complexity?

1) To sort the array, first create a min-heap with the first k+1 elements, plus a separate result array.
2) Because elements are at most k positions from their sorted positions, the smallest element is guaranteed to be among these k+1 elements.
3) Remove the smallest element from the min-heap (extract-min) and put it in the result array.
4) Now insert the next element from the unsorted array into the min-heap; the second-smallest element is now guaranteed to be in the heap. Perform extract-min and continue this process until no more elements remain in the unsorted array; finally, drain the heap for the remaining elements.

Time complexity:
- O(k) to build the initial min-heap
- O((n-k) log k) for the remaining elements, since each insert and extract-min on a heap of k+1 elements costs O(log k)
- Overall: O(k) + O((n-k) log k) = O(n log k)
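
A Java sketch of this procedure using java.util.PriorityQueue as the min-heap (the method name sortKSorted is a hypothetical choice for this example):

    import java.util.PriorityQueue;

    // Sketch: sort an array whose elements are at most k positions from their sorted place.
    static void sortKSorted(int[] arr, int k) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        int write = 0;
        // Seed the min-heap with the first k+1 elements.
        for (int i = 0; i <= k && i < arr.length; i++)
            heap.add(arr[i]);
        // Each extract-min yields the next-smallest element; refill from the unsorted tail.
        for (int i = k + 1; i < arr.length; i++) {
            arr[write++] = heap.poll();
            heap.add(arr[i]);
        }
        // Drain the remaining elements in sorted order.
        while (!heap.isEmpty())
            arr[write++] = heap.poll();
    }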

HashMap: Give the instructions for linear probing.

1. First use the hash function to compute the home slot.
2. If there is a collision, step to the right one slot at a time until an empty slot is found.
3. If we reach the end of the array, cycle around to the beginning.
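
A minimal open-addressing sketch in Java; the table layout and method here are illustrative assumptions, not how java.util.HashMap actually works (it uses separate chaining):

    // Sketch: insert a key into an open-addressed table using linear probing.
    static void insert(Integer[] table, int key) {
        int slot = Math.floorMod(key, table.length); // 1. home slot (an Integer's hash is its value)
        while (table[slot] != null)                  // 2. collision: step right one slot at a time...
            slot = (slot + 1) % table.length;        // 3. ...cycling back to the beginning at the end
        table[slot] = key; // assumes the table always has at least one empty slot
    }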

What is the heap sort analysis?

1. It loops through N-1 times, swapping elements and reheaping.
2. A complete binary tree with N nodes has O(log2(N+1)) levels.
3. In the worst case, the root element has to be bumped down to a leaf position, so reheapDown makes O(log2N) comparisons.
4. Multiplying this activity by the N-1 iterations shows that the sorting loop is O(N log2N).

Postfix notation is a notation for writing arithmetic expressions in which the operators appear after their operands. Example: (2 + 14) × 23 ----> 2 14 + 23 ×. More examples:
1. postfix: 4 5 7 2 + - × ; infix: 4 × (5 - (7 + 2)) = -16
2. postfix: 3 4 + 2 × 7 / ; infix: ((3 + 4) × 2) / 7 = 2
3. postfix: 5 7 + 6 2 - × ; infix: (5 + 7) × (6 - 2) = 48
What is the implementation of the postfix evaluation method, and what data structure do we use?

We use a stack:

while more items exist
    Get an item
    if item is an operand
        stack.push(item)
    else
        operand2 = stack.top()
        stack.pop()
        operand1 = stack.top()
        stack.pop()
        Set result to (apply operation corresponding to item to operand1 and operand2)
        stack.push(result)
result = stack.top()
stack.pop()
return result
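
A runnable Java sketch of this algorithm using java.util.ArrayDeque as the stack, supporting the four binary integer operators (the method name evalPostfix is an assumption for this example):

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Sketch: evaluate a space-separated postfix expression, e.g. "4 5 7 2 + - *" -> -16.
    static int evalPostfix(String expr) {
        Deque<Integer> stack = new ArrayDeque<>();
        for (String item : expr.trim().split("\\s+")) {
            if (item.matches("-?\\d+")) {          // operand: push it
                stack.push(Integer.parseInt(item));
            } else {                               // operator: pop two operands, push the result
                int operand2 = stack.pop();
                int operand1 = stack.pop();
                switch (item) {
                    case "+": stack.push(operand1 + operand2); break;
                    case "-": stack.push(operand1 - operand2); break;
                    case "*": stack.push(operand1 * operand2); break;
                    case "/": stack.push(operand1 / operand2); break;
                    default: throw new IllegalArgumentException("unknown token: " + item);
                }
            }
        }
        return stack.pop();
    }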

Consider the array A[] = {6, 4, 8, 1, 3} and apply insertion sort to it. If the cost associated with each sort is 25 rupees, what is the total cost of the insertion sort by the time element 1 reaches the first position of the array?

50. When element 1 reaches the first position of the array, only two sorts have been required (inserting 4 and inserting 1; the 8 needs no move), hence 25 × 2 = 50 rupees.
Step 1: 4 6 8 1 3
Step 2: 1 4 6 8 3

Which of the following is true about merge sort?

- Merge sort works better than quick sort if data is accessed from slow sequential memory.
- Merge sort is a stable sort by nature.
- Merge sort outperforms heap sort in most practical situations.

Is a BST more space efficient than a Linked List?

No. The binary search tree, with its extra reference in each node, takes up more memory space than a singly linked list

Which of the following sorting techniques has the highest best-case runtime complexity?

- Quick sort best case: O(n logn)
- Selection sort best case: O(n^2) (the highest of these)
- Insertion sort best case: O(n)
- Bubble sort best case: O(n)

What is Bubble Sort at its best?

The bubble sort is at its best if the input data is already sorted, i.e., if the input data is in the same order as the expected output. This can be detected using one boolean variable, which checks whether any values were swapped in the inner loop. Consider the following code snippet:

#include <stdio.h>

void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

int main() {
    int arr[] = {10, 20, 30, 40, 50}, i, j, isSwapped;
    int n = sizeof(arr) / sizeof(*arr);
    isSwapped = 1;
    for (i = 0; i < n - 1 && isSwapped; ++i) {
        isSwapped = 0;
        for (j = 0; j < n - i - 1; ++j)
            if (arr[j] > arr[j + 1]) {
                swap(&arr[j], &arr[j + 1]);
                isSwapped = 1;
            }
    }
    for (i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    return 0;
}

Observe that in the above code the outer loop runs only once, because the already-sorted input produces no swaps in the first pass.

If one uses the straight two-way merge sort algorithm to sort the following elements in ascending order, what is the order of these elements after the second pass of the algorithm? 20, 47, 15, 8, 9, 4, 40, 30, 12, 17

8, 15, 20, 47, 4, 9, 30, 40, 12, 17. In the first pass the elements are merged into sorted runs of 2 (n/2 subarrays); in the second pass they are merged into sorted runs of 4 (n/4 subarrays), with the final pair 12, 17 left over.

What is time complexity of BFS and DFS?

- O(V + E) when Adjacency List is used - O(V^2) when Adjacency Matrix is used

Where is the root node and where is the first nonleaf node stored in our heap?

- Root: we know where the root node is stored in our array representation of heaps: values[0].
- First nonleaf node: from our knowledge of the array-based representation of a complete binary tree, we know the first nonleaf node is found at position SIZE/2 - 1.
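
The standard index arithmetic for that array representation, as a small Java sketch:

    // Index arithmetic for an array-based complete binary tree (0-based indices).
    static int parent(int i)     { return (i - 1) / 2; }
    static int leftChild(int i)  { return 2 * i + 1; }
    static int rightChild(int i) { return 2 * i + 2; }
    // The root lives at index 0; the first nonleaf node is at SIZE / 2 - 1.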

What are the possible cases when removing a node from a BST?

1) Node to be deleted is a leaf: simply remove it from the tree.
2) Node to be deleted has only one child: copy the child to the node and delete the child.
3) Node to be deleted has two children: find the inorder successor of the node, copy the contents of the inorder successor to the node, and delete the inorder successor. (Note that the inorder predecessor can also be used.)

See images here: https://www.geeksforgeeks.org/binary-search-tree-set-2-delete/
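
A recursive Java sketch of the three cases, reusing the illustrative Node class from the traversal sketch above (an assumption-laden sketch, not the text's own implementation):

    // Sketch: delete key from the subtree rooted at node; returns the new subtree root.
    static Node delete(Node node, int key) {
        if (node == null) return null;
        if (key < node.value)      node.left  = delete(node.left, key);
        else if (key > node.value) node.right = delete(node.right, key);
        else {
            if (node.left == null)  return node.right;   // cases 1 and 2: leaf or single right child
            if (node.right == null) return node.left;    // case 2: single left child
            Node succ = node.right;                      // case 3: find the inorder successor
            while (succ.left != null) succ = succ.left;
            node.value = succ.value;                     // copy the successor's contents...
            node.right = delete(node.right, succ.value); // ...then delete it from the right subtree
        }
        return node;
    }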

How is insert() performed on a BST?

A new key is always inserted at a leaf. We start searching for the key from the root until we hit a leaf node. Once a leaf node is found, the new node is added as a child of the leaf node.

      100                       100
     /   \      Insert 40      /   \
    20   500    --------->    20   500
   /  \                      /  \
  10   30                   10   30
                                  \
                                   40
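
A matching recursive Java sketch, again assuming the illustrative Node class from earlier (keys equal to the node's value go left, matching the BST property stated above):

    // Sketch: recursive BST insert; returns the (possibly new) root of the subtree.
    static Node insert(Node node, int key) {
        if (node == null) {               // reached a leaf position: place the new node here
            Node fresh = new Node();
            fresh.value = key;
            return fresh;
        }
        if (key <= node.value) node.left  = insert(node.left, key);
        else                   node.right = insert(node.right, key);
        return node;
    }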

What is the worst case time complexity of insertion sort where position of the data to be inserted is calculated using binary search?

Applying binary search to calculate the position of the data to be inserted doesn't reduce the time complexity of insertion sort. This is because inserting an element at the appropriate position involves two steps:
1. Calculate the position.
2. Shift the data from the position calculated in step 1 one step right, to create a gap where the element will be inserted.
Using binary search reduces the time complexity in step 1 from O(N) to O(logN), but the time complexity in step 2 remains O(N). So the overall complexity remains O(N^2).
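
A Java sketch of this "binary insertion sort" using java.util.Arrays.binarySearch for step 1; the O(N) shift of step 2 is still plainly visible:

    import java.util.Arrays;

    // Sketch: insertion sort that finds each insert position with binary search.
    static void binaryInsertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            // Step 1: O(log i) search within the sorted prefix arr[0..i-1].
            int pos = Arrays.binarySearch(arr, 0, i, key);
            if (pos < 0) pos = -(pos + 1);                // decode the "not found" insertion point
            // Step 2: O(i) shift to open the gap -- this keeps the sort O(N^2) overall.
            System.arraycopy(arr, pos, arr, pos + 1, i - pos);
            arr[pos] = key;
        }
    }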

What are the time complexities of the following algorithms: 1. selectionSort 2. bubbleSort 3. insertionSort 4. mergeSort 5. quickSort 6. heapSort

Best / Average / Worst:
1. selectionSort: O(N^2) for all three
2. bubbleSort: O(N^2) for all three; O(N) best case with the short-bubble optimization
3. insertionSort: O(N) / O(N^2) / O(N^2)
4. mergeSort: O(NLogN) for all three
5. quickSort: O(NLogN) best and average; O(N^2) worst
6. heapSort: O(NLogN) for all three

You have to sort a list L, consisting of a sorted list followed by a few 'random' elements. Which of the following sorting method would be most suitable for such a task?

For a sorted list followed by a few random elements:
- Bubble sort will take O(n^2) time.
- Selection sort will take O(n^2) time.
- Quick sort will take O(n^2) time, because a mostly sorted input is its worst case.
- Insertion sort will take O(n) time.
So insertion sort is the most suitable method.

What is the similarity between preorder traversal, DFS and BST?

For example, the preorder traversal (which is identical to depth-first order) can be used to duplicate a search tree: traversing a binary search tree in preorder and adding the visited elements to a new binary search tree as you go will recreate the tree in exactly the same shape.

How many times must we perform the merge operation? And what are the sizes of the subarrays involved?

Here we work from the bottom up. The original array of size N is eventually split into N subarrays of size 1.
- Merging two of those subarrays into a subarray of size 2 requires 1 + 1 = 2 steps, based on the analysis of the preceding paragraph.
- We must perform this merge operation a total of ½N times (we have N one-element subarrays and we are merging them two at a time).
- Thus, the total number of steps to create all of the sorted two-element subarrays is O(N), because 2 × ½N = N.

What is the worst case time complexity of search in a BST?

If the 1,000 nodes were arranged in a binary search tree of minimum height, it takes no more than 10 comparisons (⌊log2 1000⌋ + 1 = 10) no matter which node we are seeking. For contrast, in the worst case—searching for the last node in a linear linked list—we must look at every node in the list; on average, we must search half of the list. If the list contains 1,000 nodes, it takes 1,000 comparisons to find the last node.

The height of a tree is the critical factor in determining the efficiency of searching for elements. If we begin searching at the root node and follow the references from one node to the next, accessing the node with the value J (the farthest from the root) is an O(N) operation. What is the time complexity for a tree of minimum height?

If the tree is of minimum height, its structure supports O(log2N) access to any element. This is because, for example, given the minimum-height tree to access the node containing J, we have to look at only three other nodes—the ones containing E, A, and G—before finding J.

It is easy to see that the maximum number of levels in a binary tree with N nodes is N (counting level 0 as one of the levels). But what is the minimum number of levels?

If we fill the tree by giving every node in each level two children until we run out of nodes, the tree has ⌊log2N⌋ + 1 levels.

Suppose we are sorting an array of eight integers using heapsort, and we have just finished some heapify (either maxheapify or minheapify) operations. The array now looks like this: 16 14 15 10 12 27 28 How many heapify operations have been performed on root of heap?

In heapsort, we first build a heap, then we repeat the following operations until the heap size becomes 1: (a) swap the root with the last element, (b) call heapify on the root, (c) reduce the heap size by 1. In this question we are told that heapify has been called a few times, and we see that the last two elements in the given array are the two maximum elements of the array. So the situation is clear: it is max-heapify, which has been called 2 times.

Can N^2 run faster than NLogN?

In comparing order-of-growth evaluations, we ignored constants and smaller-order terms because we want to know how the algorithm performs for large values of N.
- In general, an O(N^2) sort requires few extra activities in addition to the comparisons, so its constant of proportionality is fairly small.
- Conversely, an O(N log2N) sort may be more complex, with more overhead and thus a larger constant of proportionality.
- This situation may cause anomalies in the relative performances of the algorithms when the value of N is small: N^2 is not much greater than N log2N, and the constants may dominate instead, causing an O(N^2) sort to run faster than an O(N log2N) sort.
- A programmer can leverage this fact to improve the running time of sort code by having the code switch between an O(N log2N) sort for large portions of the array and an O(N^2) sort for small portions (see the sketch below).
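
A Java sketch of that hybrid idea: a quicksort that hands small subarrays to insertion sort. The cutoff of 16 is an arbitrary illustrative value (in practice it is tuned empirically), and the Lomuto partition used here is one standard choice, not the text's:

    // Sketch: quicksort that switches to an O(N^2) sort below a size cutoff.
    static final int CUTOFF = 16; // illustrative threshold

    static void hybridSort(int[] a, int lo, int hi) {
        if (hi - lo < CUTOFF) {            // small portion: the O(N^2) sort wins here
            insertionSort(a, lo, hi);
            return;
        }
        int p = partition(a, lo, hi);      // large portion: O(N log2N) divide and conquer
        hybridSort(a, lo, p - 1);
        hybridSort(a, p + 1, hi);
    }

    // Lomuto partition around the last element.
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi], i = lo;
        for (int j = lo; j < hi; j++)
            if (a[j] < pivot) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;
        return i;
    }

    // Insertion sort restricted to a[lo..hi].
    static void insertionSort(int[] a, int lo, int hi) {
        for (int i = lo + 1; i <= hi; i++) {
            int key = a[i], j = i - 1;
            while (j >= lo && a[j] > key) { a[j + 1] = a[j]; j--; }
            a[j + 1] = key;
        }
    }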

When is the ordering of duplicates in an array important?

In our descriptions of the various sorts, we showed examples of sorting arrays of integers, and stability is not important when sorting primitive types. If we sort objects, however, the stability of a sorting algorithm becomes more important: we may want to preserve the original order of distinct objects that the comparison operation considers identical.

The worst case running times of Insertion sort, Merge sort and Quick sort, respectively, are:

Insertion sort takes Θ(n^2) in the worst case, as we need to run two loops: the outer loop picks elements one by one to insert at the right position, and the inner loop both finds the position of the element to be inserted and moves all sorted greater elements one position ahead. The worst-case recurrence is therefore T(n) = T(n-1) + Θ(n). Merge sort takes Θ(n log n) time in all cases: we always divide the array into two halves, sort the two halves, and merge them, giving the recurrence T(n) = 2T(n/2) + Θ(n). Quicksort takes Θ(n^2) in the worst case: we take an element as the pivot and partition the array around it, and in the worst case the picked element is always a corner element, so the recurrence becomes T(n) = T(n-1) + Θ(n). An example scenario for the worst case is a sorted array with code that always picks a corner element as the pivot.

Which of the following statements is correct with respect to insertion sort?
- Online: can sort a list as it receives it at runtime
- Stable: doesn't change the relative order of elements with equal keys

Insertion sort is stable and online, but it is not well suited to a large number of elements.

A sorting technique is called stable if:

It maintains the relative order of occurrence of non-distinct elements

You have to sort 1 GB of data with only 100 MB of available main memory. Which sorting technique will be most appropriate?

Merge Sort. It can be used as an external sort: sort chunks that fit in main memory, write the sorted runs to disk, and then merge the runs.

What is the best sorting algorithm to use for the elements in array are more than 1 million in general?

Most practical implementations of Quick Sort use the randomized version, which has an expected time complexity of O(nLogn). The worst case is still possible in the randomized version, but it doesn't occur for any particular pattern (such as a sorted array), so randomized Quick Sort works well in practice. Quick Sort is also a cache-friendly sorting algorithm, as it has good locality of reference when used on arrays, and it is tail recursive, so tail-call optimization can be applied.

Is the recursive version much less efficient than the non-recursive version?

No. Both the recursive and the nonrecursive versions of size are O(N) operations. Both have to count every node.

Which NLogN sorts are unstable?

Of the various types of sorts that we have discussed in this text, only heapSort and quickSort are inherently unstable.

Consider a situation where swap operation is very costly. Which of the following sorting algorithms should be preferred so that the number of swap operations are minimized in general?

Selection sort makes O(n) swaps, which is the minimum among all the sorting algorithms mentioned above.

Which one of the following in place sorting algorithms needs the minimum number of swaps?

Selection sort takes the minimum number of swaps to sort an array: at most O(n) swaps for an array with n elements (although it still makes O(n^2) comparisons).

What is a use case of postorder traversal?

Since postorder traversal starts at the leaves and works backwards toward the root, it can be used to delete the tree, node by node, without losing access to the rest of the tree while doing so—this is analogous to the way a tree surgeon brings down a tree branch by branch, starting way out at the leaves and working backwards toward the ground.

The average case and worst case complexities for Merge sort algorithm are

The best-case, average-case, and worst-case complexities of the merge sort algorithm are all O(n log2n).

Which sorting algorithm will take least time when all elements of input array are identical? Consider typical implementations of sorting algorithms.

Insertion sort will take Θ(n) time, because an array of identical elements is already sorted.

The tightest lower bound on the number of comparisons, in the worst case, for comparison-based sorting is of the order of?

The number of comparisons that a comparison-based sort requires increases in proportion to N log(N), where N is the number of elements to sort. This bound is asymptotically tight: given a list of distinct numbers (which we can assume because this is a worst-case analysis), there are N! permutations, exactly one of which is the list in sorted order. The sort algorithm must gain enough information from the comparisons to identify the correct permutation. If the algorithm always completes after at most f(N) steps, it cannot distinguish more than 2^f(N) cases, because the keys are distinct and each comparison has only two possible outcomes. Therefore 2^f(N) >= N!, or equivalently f(N) >= log(N!). Since log(N!) is Omega(N log N), the answer is N log N.

A list of n string, each of length n, is sorted into lexicographic order using the merge-sort algorithm. The worst case running time of this computation is

The recurrence tree for merge sort will have height log(n), and O(n^2) work is done at each level of the recurrence tree (each level involves n comparisons, and each comparison of two length-n strings takes O(n) time in the worst case). So the time complexity of this merge sort is O(n^2 log n).

The auxiliary space of insertion sort is O(1). What does O(1) mean?

The term O(1) states that the space required by insertion sort is constant, i.e., the space required doesn't depend on the input: insertion sort consumes the same amount of extra memory for all inputs.

What are some ways we can construct a BST? And does Inorder Traversal of a BST always produce sorted output?

We can construct a BST from only its preorder, postorder, or level-order traversal. Note that we can always obtain the inorder traversal by sorting the one given traversal. And yes, inorder traversal of a BST always produces sorted output.

Describe the first steps of the Heap Sort process?

We can easily access the largest element from the original heap—it is in the root node. In our array representation of heaps, the location of the largest element is values[0].
- This value belongs in the last-used array position, values[SIZE - 1], so we can just swap the values in these two positions.
- Because values[SIZE - 1] now contains the largest value in the array (its correct sorted position), we want to leave this position alone.
- Now we are dealing with a set of elements, from values[0] through values[SIZE - 2], that is almost a heap.

Assume that we use Bubble Sort to sort n distinct elements in ascending order. When does the best case of Bubble Sort occur?

When elements are sorted in ascending order

What are the time and space complexities of BFS on a graph?

Worst-case performance: O( |V|+ |E| ) Worst-case space complexity: O( |V| )

Is the depth of tree recursion relatively shallow?

Yes. The depth of recursion depends on the height of the tree. If the tree is well balanced (relatively short and bushy, not tall and stringy), the depth of recursion is closer to O(log2N) than to O(N).

Which is the correct order of the following algorithms with respect to their time Complexity in the best case ?

insertion sort < Quick sort < Merge sort < selection sort
- Insertion sort: O(n)
- Quick sort: O(n logn)
- Merge sort: O(n logn)
- Selection sort: O(n^2)

What is a simple way to find the size of a subarray?

last - first + 1, or hi - lo + 1. For example, the subarray from index 3 through index 7 contains 7 - 3 + 1 = 5 elements.

