DSA Final Exam


How do you restore property after adding in B Tree (2-3 tree) when the 3 node has a 2 node parent ?

- add key standardly
- split the overflowed node
- merge the new root into the parent

How do you restore property after adding in B Tree (2-3 tree) when the 3 node has a 3 node parent ?

- add key standardly
- split the overflowed node
- merge the new root into the parent
- repeat the split and merge steps until no violation remains

How do you restore property after adding in B Tree (2-3 tree) when the 3 node has no parent (root) ?

- add the key to the 3 node
- turn the middle key into a 2 node root
- turn the other two keys into 2 node children

Complexity of Hash Table LP: space complexity for storing n elements

- no matter how many elements, we always need to maintain a table of size "h"
- assume each object consumes M1 space
- in total, consumes M1 * h = O(h) space
* space does not necessarily depend on n; however, if we further assume the smallest necessary table size (which is n), then h = O(n) and therefore space is O(h) = O(n)

Remove and restore B tree when leaf is a 2 node and a sibling is a 3 node

- remove the key
- fill the hole with a parent key, and fill the parent's hole with a key from the 3 node sibling

Remove and restore B tree when leaf is a 2 node and no sibling is 3 node but parent is

- remove the key
- merge one parent key and one sibling key into the hole

Complexity of Hash Table SC: time complexity for deletion

- search for node ... time O(m) - remove node = delete on linked list = O(1)

What are some common operations that trigger splaying in a Splay Tree?

1. Search - if the key is found at node X, splay X
2. Add - if a node is added, splay it
3. Delete (its own deletion algorithm, unlike search and add) - to delete a node, splay it, then delete it, then splay its replacement node

How do you splay? (what are the scenarios, depending on node X, parent P, grandparent G)

1. X only has P (left child) but not G
2. X only has P (right child) but not G
3. X has both P (left child) and G (left child)
4. X has both P (right child) and G (right child)
5. X has both P (left child) and G (right child)
6. X has both P (right child) and G (left child)

How does breadth-first traverse work in a graph?

1. begin with any node X1
2. visit all neighbors of X1 (let X2 be the set of unvisited neighbors)
3. visit all unvisited neighbors of X2 (let X3 be the set of unvisited neighbors)
4. visit all unvisited neighbors of X3, ... until the unvisited neighbor set is empty
* breadth-first traverse is guaranteed to visit all nodes in an undirected graph as long as the graph is "connected"
* BFT on a directed graph is similar, except that we can only move along directed edges, and we are guaranteed to visit all nodes only on a "strongly connected" graph

How does depth-first traverse work in a graph?

1. begin with node X
2. pick an unvisited neighbor of X (randomly) and visit it; repeat until the visited node has no unvisited neighbor
3. backtrack to the nearest node with an unvisited neighbor and apply step 2; stop when back at node X and it has no unvisited neighbors
* guaranteed to visit all nodes in an undirected graph as long as the graph is connected
* DFT on a directed graph is similar, except that we can only move along directed edges, and we are guaranteed to visit all nodes only on a strongly connected graph

Complexity of AVL Tree: time complexity for adding (best case)

1. do regular node adding
- best case search time O(h) = O(log_2n)
- adding time O(1)
- total time O(log_2n + 1) = O(log_2n)
2. check violation and restore if needed
- best case means no violation, O(1) time check
3. best case total time is:
- T = O(log_2n) + O(1) = O(log_2n)

Complexity of AVL Tree: time complexity for adding (worst case)

1. do regular node adding
- worst case search time O(h) = O(log_2n)
- adding time O(1)
- total time O(log_2n + 1) = O(log_2n)
2. check violation and restore if needed
- worst case time is always O(h) = O(log_2n)
3. worst case total time is:
- T = O(log_2n) + O(log_2n) = O(2*log_2n) = O(log_2n)

How do you remove and restore in an AVL Tree?

1. do regular node removal
2. check violation among ancestors of the modified node in bottom-up fashion
- if no violation occurs, finish deletion
- otherwise, find the deepest problematic node
- if the node has a left-left or right-right scenario, do a single rotation at the node
- if the node has a left-right or right-left scenario, do a double rotation at the node
- if a rotation is applied at a problematic node, apply step 2 on this node again
* be able to draw the updated tree, given a tree *

How do you add in a B Tree (2-3 tree)?

1. each node can hold up to 2 keys, so we can always add a key to a 2 node
2. adding always starts from a leaf node found by the regular search algorithm (even if there is a 2 node along the search path)
3. if the leaf is a 2 node, just add the key to it; if the leaf is a 3 node, add the key anyway (this breaks the property) and then restore the property
*KNOW HOW TO RESTORE*

Three scenarios for restoring property after adding in B Tree (2-3 tree)?

1. the 3 node has no parent (root)
2. the 3 node has a 2 node parent
3. the 3 node has a 3 node parent

What is the matrix representation of a graph?

A 2D matrix where each cell represents an edge between two vertices: store a graph of n nodes using an n by n matrix "m".
undirected graph - m[i][j] = 1 if node i and node j are connected, m[i][j] = 0 otherwise
directed graph - m[i][j] = 1 if there is a directed edge from node i to node j
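
As an illustration, a minimal Python sketch of a matrix-based graph (the edge set here is hypothetical, not from the slides):

```python
# Hypothetical 4-node undirected graph with edges (0,1), (0,2), (2,3).
n = 4
m = [[0] * n for _ in range(n)]
for i, j in [(0, 1), (0, 2), (2, 3)]:
    m[i][j] = 1  # undirected: mark both directions
    m[j][i] = 1

# neighbors of node i are the columns j where m[i][j] == 1
neighbors_of_0 = [j for j in range(n) if m[0][j] == 1]
```

For a directed graph, only `m[i][j]` would be set for an edge from i to j.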

What is a Max Heap?

A complete binary tree where the value of each node is greater than its descendant nodes.

What is a Deap?

A complete binary tree with...
1. root node is empty (no object)
2. left subtree of root is a min heap
3. right subtree of root is a max heap
4. in both subtrees, nodes at the same position indexed by i satisfy minH[i] < maxH[i]
example: minH[1] = 5, maxH[1] = 45; minH[4] = 15, maxH[4] = 20; an empty spot counts as infinity, e.g. maxH[6] = ∞

B Tree max height and min height

B tree has max height when each node holds only one key - a completely filled binary search tree: h = O(log_2n)
B tree has min height when (almost) every node holds the max # of keys - assume b - 1 keys per node: h = O(log_bn)

What is the definition of AVL Tree?

BST with extra property: - for any node in tree, left subtree and right subtree height must differ by at most one

What is the definition of Splay Tree?

BST with property - every recently accessed node is moved to the root; this process is called splaying, a sequence of rotations designed to make sure the rotated BST remains efficient for search in the average case

How do you check for violation in an AVL Tree?

Check the heights of the left and right subtrees of each node and compare them.
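
That check can be sketched recursively; this is an illustrative version (heights are recomputed at each node, which is slow but simple), with a tree node as a hypothetical (key, left, right) tuple:

```python
# A tree node is a hypothetical (key, left, right) tuple; None = empty tree.
def height(node):
    if node is None:
        return -1  # convention: empty tree has height -1
    return 1 + max(height(node[1]), height(node[2]))

def is_avl(node):
    """Check the AVL balance property at every node."""
    if node is None:
        return True
    if abs(height(node[1]) - height(node[2])) > 1:
        return False  # violation at this node
    return is_avl(node[1]) and is_avl(node[2])
```

A real AVL implementation stores heights in the nodes instead of recomputing them.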

What are the key properties that guarantee the big O height of a Min Heap?

Complete binary tree and heap property.

How does Dijkstra's algorithm work?

Finds the shortest path from a source vertex to all other vertices in a weighted graph. * refer to slide 20 for process *

What is a Min-Max Heap?

It is a complete binary tree with min levels and max levels
1. even levels are min levels and odd levels are max levels
2. at a min level, any node is the smallest node of its subtree
3. at a max level, any node is the largest node of its subtree

AVL tree proof h = Ω(log_2n)

Min height is achieved when all sibling subtrees have the same height. It should be the same as the height of a completely filled tree: n = 1 + 2 + 4 + ... + 2^h = 2^(h+1) - 1. This means h = log_2(n+1) - 1 = Ω(log_2n).

Space complexity for storing n nodes in Red-Black tree array based

Nb = number of black nodes, Nr = number of red nodes
on the rightmost path, we have...
- log_2(Nb+1) - 1 black nodes
- log_2(Nr+1) - 1 red nodes
index of the last red node is O(Nb^2) = O(n^2)

Complexity of Hash Table DAT: time complexity for deletion

O(1) - Same for both best and worst case

Complexity of Hash Table DAT: time complexity for insertion

O(1) - Same for both best and worst case

Complexity of Stack and Queue: time complexity for pushing Assume singly linked list based stack and doubly linked list based queue

O(1) - Same for both best and worst case = add head of linked list

Complexity of Stack and Queue: time complexity for popping Assume singly linked list based stack and doubly linked list based queue

O(1) - Same for both best and worst case = remove list head (stack) or list tail (queue)

Complexity of Hash Table DAT: time complexity for search

O(1) - constant time complexity

What is the space complexity of bubble sort?

O(1).

Complexity of Hash Table LP: time complexity for search

O(h) - h: table size, not necessarily dependent on n

Complexity of Hash Table DAT: space complexity for storing n elements

O(h) - which is O(k) if h = k + 1 k: largest key

Complexity of AVL Tree: time complexity for insertion *difference from BST*

O(h) = O(log_2 n) in both worst case and best case
two major differences between BST adding and AVL adding:
- BST has O(1) best case time, but AVL has O(log_2n) best case time (we will not find a spot right next to the root, since h = Ω(log_2n))
- after adding, the AVL tree needs extra violation checking and recovery (if necessary), and the recovery can go all the way to the root
*reference slide 24 for node rotation; revisit*

Time complexity of search in B-Tree

O(h) = O(log_2 n) in worst case and O(1) in best case

Time complexity of removing in Red-Black tree

O(h) = O(log_2n) in all cases - not elaborated upon

Time complexity of adding in Red-Black tree

O(h) = O(log_2n) in worst and best case *revisit recoloring*

Time complexity of search in Min Heap

O(h) = O(log_2n) in worst case O(1) in best case

Time complexity of adding in Min Heap

O(h) = O(log_2n) in worst case O(h) = O(log_2n) in best case

Time complexity of removing in Min Heap

O(h) = O(log_2n) in worst case O(h) = O(log_2n) in best case

Time complexity of search in binary search tree

O(h) = O(n) in worst case and O(1) in best case
worst case: each round has a pointer check (T1), three comparisons (3*T2), and a pointer update (T3); total time per round is T1 + 3*T2 + T3 = O(1); in the worst case there are h rounds, so total time is O(1) + ... + O(1) = O(h) = O(n)
best case: root has the search key; total time is T1 + T2 = O(1)
- T1: check if pointer is NULL
- T2: check condition key = node key
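
The search loop described above can be sketched as follows (`Node` is a hypothetical minimal class, not from the slides):

```python
class Node:
    """Minimal BST node: a key and two child pointers."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_search(root, key):
    node = root
    while node is not None:   # pointer check (T1)
        if key == node.key:   # comparison: found
            return node
        elif key < node.key:  # comparison: go left
            node = node.left  # pointer update
        else:                 # comparison: go right
            node = node.right
    return None               # key not in tree
```

Each loop iteration does constant work, and there are at most h iterations, matching the O(h) bound.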

Time complexity of adding in binary search tree

O(h) = O(n) in worst case and O(1) in best case
two major steps to adding in BST:
1. apply BST search to find a spot to add; worst case takes O(h) = O(n) time, best case takes O(1) time
2. add the new node: temp->right = &x consumes T1 = O(1) time
so, total time is...
- worst case: O(n) + O(1) = O(n)
- best case: O(1) + O(1) = O(1)
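
A minimal sketch of the two steps (search for a spot, then attach) with a hypothetical `Node` class:

```python
class Node:
    """Minimal BST node: a key and two child pointers."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_add(root, key):
    if root is None:
        return Node(key)               # empty tree: new node is the root
    temp = root
    while True:                        # step 1: search for a spot, O(h)
        if key < temp.key:
            if temp.left is None:
                temp.left = Node(key)  # step 2: attach new node, O(1)
                return root
            temp = temp.left
        else:
            if temp.right is None:
                temp.right = Node(key)
                return root
            temp = temp.right
```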

B Tree height h = ?, assume each node has no more than b-1 keys.

O(log_2n) = Ω(log_bn)

Time complexity of removing in B-Tree

O(log_2n) in worst case and O(h) = O(log_bn) in best case
1. find the target key
2. swap it with an in-order predecessor (leaf node)
3. remove at the leaf node and address any violation

Time complexity of adding in B-Tree

O(log_2n) in worst case and O(h) = O(log_bn) in best case
adding has two major steps:
- regular adding: O(log_2n) in worst case, O(log_bn) in best case
- recovery (split and merge): O(log_2n) in worst case, O(1) in best case

Complexity of Hash Table SC: time complexity for search

O(m) - m: size of the longest chain, often m = O(n)

What is the space complexity of counting sort?

O(n + m), where n is the size of the array and m is the range of input values (largest key).

What is the time complexity of counting sort?

O(n + m), where n is the size of the array and m is the largest key *recap process slide 26 page 24*
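
A sketch of counting sort under these assumptions (non-negative integer keys, largest key m):

```python
def counting_sort(a, m):
    """Sort non-negative integers in a, where m is the largest key."""
    count = [0] * (m + 1)      # the count array is the O(m) space cost
    for x in a:                # O(n) counting pass
        count[x] += 1
    out = []
    for key in range(m + 1):   # O(n + m) output pass
        out.extend([key] * count[key])
    return out
```

The two passes together give the O(n + m) time bound.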

What is the space complexity for storing a graph of n pointers and m node objects with a list?

O(n + m).

What is the time complexity of merge sort?

O(n log_2n), where n is the size of the array *recap process slide 26 page 19*
- division phase does not consume much time: O(log_2n) splits
- merge phase: O(log_2n) merges in total, each takes O(n) time
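
A minimal recursive sketch of the divide-and-merge structure:

```python
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2                        # division phase: O(log_2 n) levels
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge phase: O(n) work per level
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]     # append whichever side remains
```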

Space complexity for storing n nodes in B-Tree

O(n)

Complexity of Both Linked-List: space complexity for storing n nodes

O(n) - Linear space complexity

Complexity of Singly Linked-List: time complexity of search

O(n) - Worst case: element is at the end, Best case: element is at the beginning

Complexity of Stack and Queue: space complexity for storing n objects in linked-list based implementation Assume singly linked list based stack and doubly linked list based queue

O(n) - linked list space complexity *for array-based implementation, space depends on array size (may be > n).

Complexity of Doubly Linked-List: time complexity of search

O(n) - worst case: need to traverse whole list, best case: key is head or tail

What is the space complexity of merge sort?

O(n).

Time complexity of breadth-first traverse in graph

O(n+m) for list-based and O(n^2) for matrix-based
BFS consists of two parts:
1. time to visit a node (T1) - total time is n*T1 = O(n)
2. time to check a neighbor (T2)
- for a list-based graph, total time is 2*m*T2 = O(m)
- for a matrix-based graph, total time is n^2*T2 = O(n^2)
- m is the number of edges
total time:
- list-based: O(n) + O(m) = O(n+m)
- matrix-based: O(n) + O(n^2) = O(n^2)

Time complexity of depth-first traverse in graph (list and matrix)

O(n+m) for list-based and O(n^2) for matrix-based
DFS consists of two parts:
1. time to visit a node (T1) - total time is n*T1 = O(n)
2. time to check a neighbor (T2)
- for a list-based graph, total time is 2*m*T2 = O(m)
- for a matrix-based graph, total time is n^2*T2 = O(n^2)
- m is the number of edges
total time:
- list-based: O(n) + O(m) = O(n+m)
- matrix-based: O(n) + O(n^2) = O(n^2)

What is the time complexity of bubble sort?

O(n^2), where n is the size of the array. reference slides 26 page 4 for example process
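
An in-place sketch, which also shows why the extra space is O(1):

```python
def bubble_sort(a):
    """Sort list a in place; nested passes give O(n^2) comparisons."""
    n = len(a)
    for i in range(n - 1):        # after pass i, the last i+1 slots are final
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # swap in place: O(1) extra space
    return a
```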

Complexity of AVL Tree: time complexity for deletion

Removing time is O(h) = O(log_2 n) in both worst case and best case
4 major steps:
1. find the target node downwards
2. standard BST removal (fill holes)
3. check violation upwards
4. restore if needed
time complexity is always O(log_2n)

How can a Min Heap be applied for sorting?

Repeatedly remove the root node to get the elements in sorted order.
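
This heap-based sorting idea can be sketched with Python's heapq module (a binary min heap):

```python
import heapq

def heap_sort(items):
    """Build a min heap, then repeatedly remove the root to emit sorted order."""
    heap = list(items)
    heapq.heapify(heap)  # build the min heap in O(n)
    # n removals, each O(log n), for O(n log n) total
    return [heapq.heappop(heap) for _ in range(len(heap))]
```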

How does rotation work in an AVL Tree?

Rotation is used to balance the tree by performing left or right rotations on nodes. Know how this works and be able to draw updated tree.

What are the key properties that guarantee the big O height of an AVL Tree?

The heights of the two child subtrees of any node differ by at most one, and the left and right subtrees are also AVL trees.

What are the key properties that guarantee the big O height of a Red-Black Tree?

At each node, all paths to its descendant leaf nodes contain the same number of black nodes (same black height), and a red node can only have black children (no adjacent red nodes).

How do you find the max/min node in an AVL Tree?

Traverse to the rightmost/leftmost node in the tree. be able to write a few lines of code to implement

Splay in scenario 1: X only has P (left child) but not G

We just take clockwise rotation at root

How do you add and restore in an AVL Tree?

When the AVL property breaks at a node, we may restore it by rotating the tree at that node
1. clockwise rotation - if the left subtree is taller
2. counter clockwise rotation - if the right subtree is taller
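
The two single rotations can be sketched as below (`Node` is a hypothetical minimal class; a full AVL restore also has to pick where to rotate and handle the double-rotation cases, which is omitted here):

```python
class Node:
    """Minimal node for illustrating rotations."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def rotate_right(node):
    """Clockwise rotation: use when the left subtree is taller (left-left case)."""
    pivot = node.left
    node.left = pivot.right   # pivot's right subtree moves under node
    pivot.right = node
    return pivot              # pivot is the new subtree root

def rotate_left(node):
    """Counter-clockwise rotation: right subtree taller (right-right case)."""
    pivot = node.right
    node.right = pivot.left
    pivot.left = node
    return pivot
```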

What does it mean for a directed graph to be strongly connected?

a directed graph is strongly connected if there is a directed path from any node to any other node. a directed path follows the edge directions

Complexity of Singly Linked-List: time complexity of adding to head/tail

adding to head - O(1) - Same for both best and worst case adding to tail - O(n) - traverse to tail

How do you remove and restore in a B Tree? What are the different cases we studied?

always delete a key at a leaf node: if it is not there, keep swapping it with its in-order predecessor/successor until it reaches a leaf node (then delete)
case 1: leaf is a 3 node
case 2: leaf is a 2 node; a sibling is a 3 node
case 3: leaf is a 2 node; no sibling is a 3 node but the parent is

What is the amortized time complexity of each operation in splaying?

the amortized time of splay tree operations is O(log_2n), just as efficient as an AVL tree * just memorize this *

amortized time analysis

analysis of time over a sequence of operations

How do you add and restore a node in a Min Heap?

because a heap is a complete tree, the new node can only be added at the end (the structure of the updated tree is forced)
this may break the heap property, so we apply up-heap bubbling to restore:
- first add the new node at the end
- keep swapping it with a bigger parent; stop when it hits a smaller parent or the root
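
Up-heap bubbling on an array-based min heap (root at index 0, children of i at 2i+1 and 2i+2) can be sketched as:

```python
def heap_add(heap, key):
    """Add key to an array-based min heap and bubble it up."""
    heap.append(key)               # add at the end: keeps the tree complete
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:
            break                  # stop at a smaller (or equal) parent
        heap[parent], heap[i] = heap[i], heap[parent]  # swap with bigger parent
        i = parent
```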

What is the time complexity of binary search?

best case O(1), where the key is at the middle index
worst case O(log_2n), where the key is found only after all halvings (e.g. it is at the first position) or is not present
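
A minimal iterative sketch on a sorted list:

```python
def binary_search(a, key):
    """Return the index of key in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # best case: key at the first middle, O(1)
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            lo = mid + 1       # discard the left half
        else:
            hi = mid - 1       # discard the right half
    return -1                  # worst case: O(log_2 n) halvings
```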

What is the space complexity of depth-first traverse with stack-based implementation?

consumes O(n) in the worst case
- this is negligible compared to the storage space for the graph itself
- main space cost is the stack used to hold nodes
*reference slide 25 page 38 for example process; revisit*

What is the space complexity of breadth-first traverse with queue-based implementation?

consumes O(n) in the worst case
- analysis on BFT is the same as on DFT
- main space cost is the queue used to hold nodes
*reference slide 25 page 49 for example process*

Space complexity for list based binary tree

each node object consumes space M = M1 + 3*M2
- M1: space for the integer
- M2: space for a node pointer
a tree of n nodes consumes space n*M = n*(M1 + 3*M2) = O(n)

Complexity of Hash Table LP: time complexity for deletion

first apply search ... O(h) in worst case
then delete the target node ... O(1)
total time is O(h + 1) = O(h)
h: table size

AVL tree proof h = O(log_2n)

first consider the number of nodes in an AVL tree of height "h": at any node, the left subtree height and right subtree height differ by at most one
N(h): # nodes in the minimal AVL tree of height h
fix the root node:
- N(h-1): # of nodes in the left subtree
- N(h-2): # of nodes in the right subtree
so N(h) = 1 + N(h-1) + N(h-2), which implies h < 2*log_2n = O(log_2n)

How do you add and restore in a Red-Black Tree?

first do BST adding and color the new node red, then perform recoloring and rotations to balance the tree
know how to add and restore (in all cases); be able to draw the updated tree; specifically,
- know when and how to apply recoloring
- know when and how to apply rotation + color swapping
(tip: you only need to memorize left-left and how left-right is converted to left-left; all other scenarios are mirrored)

What is the definition of B Tree?

generalization of BST with the following properties
1. a node can store up to k keys - a node with k keys is called a k+1 node
2. a node with k keys must have k+1 child nodes, located between the sorted keys - nodes between keys a & b hold keys in (a,b)
3. all leaf nodes have the same depth - encourages small height

What does it mean for a graph to be connected?

a graph is connected if there is a path from any node to any other node. this path may or may not follow the edge directions

Red Black tree h = ?

h = O(log_2n)

Big O tree height Min Heap

h = O(log_2n) = Ω(log_2n)

AVL tree height h = ?

h = O(log_2n) = Ω(log_2n)

Binary tree height h = ?

h = O(n) = Ω(log_2n)

Complexity of Singly Linked-List: time complexity of removing from head/tail

head - O(1) - Same for both best and worst case tail - O(n) - traverse to tail

Complexity of Doubly Linked-List: time complexity of adding to head/tail

head - O(1) - more steps than singly, but same complexity tail - O(1) - tail pointer

Complexity of Doubly Linked-List: time complexity of removing from head/tail

head - O(1) - more steps than singly, but same complexity tail - O(1) - tail pointer

What is a Min Heap?

the heap data structure is used to store prioritized objects in a way that nodes with higher priorities are closer to the root. a min heap is a complete binary tree with the property that any node is smaller than its descendant nodes.

Complexity of Hash Table SC: time complexity for insertion

if add key to head - add to list head - O(1) if add key to tail - add to list tail - O(m) m: size of the longest chain

Remove and restore B tree when leaf is a 3 node

just delete, leaf becomes a 2 node

How do you use a queue to implement breadth-first traverse?

key is to use a queue to keep track of the nodes to visit
1. push a node into an empty queue Q
2. repeat:
- visit node X at the front of Q
- push X's unvisited neighbors into Q (for efficiency, we can mark them as visited now)
- pop X out of Q
- stop when Q becomes empty
* slide 19 page 67 for visual *
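
This queue-based procedure can be sketched as follows (an adjacency-list graph is assumed, with `adj[i]` the neighbor list of node i; marking on push follows the efficiency note above):

```python
from collections import deque

def bft(adj, start):
    """Breadth-first traverse; returns nodes in visit order."""
    order = [start]
    marked = {start}          # mark on push, so a node enters Q at most once
    q = deque([start])
    while q:                  # stop when Q becomes empty
        x = q.popleft()       # node X at the front of Q
        for nb in adj[x]:
            if nb not in marked:
                marked.add(nb)
                order.append(nb)
                q.append(nb)  # push X's unvisited neighbors
    return order
```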

What is the definition of degree in a graph?

let X be any node
degree = # of edges with X as an endpoint - e.g. D(7) = 3, D(15) = 2
incoming degree = # of incoming edges at X - e.g. ID(7) = 2, ID(15) = 0
outgoing degree = # of outgoing edges at X - e.g. OD(7) = 1, OD(15) = 2

Space complexity for storing n nodes in Min Heap

list based and array based: O(n) because it is a complete tree, no cell is wasted in array

Complexity of AVL Tree: space complexity for storing n elements

list-based is O(n) and array-based is O(n^2)
array-based - analysis is the same as on BST: space = array size * cell space
- an array cell takes M space
- array size = max node index + 1
- worst case max index is 2^(h+1) - 1, where h < 2*log_2n, so max index is O(2^(2*log_2n)) = O(n^2)

Space complexity for storing n nodes in Red-Black tree

list: O(n) array: O(n^2) *similar to AVL*

Min Heap height

min heap is a complete binary tree
height of a complete tree of n nodes is O(log_2n) and Ω(log_2n)

Complexity of Hash Table SC: space complexity for storing n elements assuming table stores nodes only

n nodes, each with
- key integer (M2)
- next pointer (M3)
- any satellite data (M4 in total)
in total, n * (M2 + M3 + M4) = O(n)
h: table size

What is the space complexity for storing a graph with a matrix?

need an n by n matrix. always O(n^2) space

Time complexity of removing in binary search tree *based on naive hole-filling algorithm*

no matter where the target node is, we always have T = O(h)
three major steps to removal:
1. find the target node and remove it
2. replace the hole with the max/min in the left/right subtree
3. repeat step 2 until a leaf node is used to fill a hole

How do you find neighbors in a list-based implementation of a graph?

nodes connected to node i are on a list at table[i]

How do you remove and restore a node in a Min Heap?

on a heap, we only remove the root node (highest priority) at each time
after removal, apply down-heap bubbling to restore:
- first fill the hole with the last node
- keep swapping it with its smaller child (smallest of the two); stop when it hits a leaf or both children are bigger
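
Down-heap bubbling on an array-based min heap (root at index 0, children of i at 2i+1 and 2i+2) can be sketched as:

```python
def heap_remove(heap):
    """Remove and return the root of an array-based min heap."""
    root = heap[0]
    last = heap.pop()              # fill the hole with the last node
    if heap:
        heap[0] = last
        i = 0
        while True:
            left, right = 2 * i + 1, 2 * i + 2
            smallest = i
            if left < len(heap) and heap[left] < heap[smallest]:
                smallest = left
            if right < len(heap) and heap[right] < heap[smallest]:
                smallest = right
            if smallest == i:      # leaf, or both children are bigger: stop
                break
            heap[i], heap[smallest] = heap[smallest], heap[i]
            i = smallest
    return root
```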

What is the definition of Red-Black Tree?

red black tree is bst with extra properties: - a node is colored either red or black - root is black - all leaf nodes are black - red node can only have a black child - at each node, all paths to its descendant leaf nodes contain the same number of black nodes

Space complexity for storing n nodes in binary search tree

same as binary tree O(n) for list based O(2^n) for array based

Space complexity for array based binary tree

space = array size * cell space
- each array cell stores a node object, which consumes M space (same as list based)
- array size = max node index + 1; in the worst case, the max node index is 2^n - 1, so array size is 2^n
total space is M*2^n = O(2^n)

What is the list representation of a graph?

store a graph of n nodes using a size n table where nodes connected to node i are on a list at table[i]. (looks like a hash table with SC)
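
A minimal sketch of this table-of-lists layout (the edge set here is hypothetical):

```python
# Hypothetical 4-node undirected graph with edges (0,1), (0,2), (2,3).
n = 4
table = [[] for _ in range(n)]  # size-n table; table[i] lists neighbors of i
for i, j in [(0, 1), (0, 2), (2, 3)]:
    table[i].append(j)          # undirected: record the edge in both rows
    table[j].append(i)
```

Total space is the table (n rows) plus one list entry per edge endpoint, i.e. O(n + m).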

Complexity of Hash Table SC: space complexity for storing n elements assuming table stores pointer

table of h pointers, each M1 space (h * M1 in total)
n nodes outside the table, each with
- key integer (M2)
- next pointer (M3)
- any satellite data (M4 in total)
in total, h * M1 + n * (M2 + M3 + M4) = O(h + n)
h: table size

How do you use a stack to implement depth-first traverse in a graph?

use a stack to keep track of nodes for backtracking; suppose we are at node i
1. if i has not been visited, visit it and push it onto the stack (suppose we have a way to mark visit history)
2. if i has been visited, find an unvisited neighbor node j (randomly, by searching row i of "m")
- if j is found, apply steps 1-2 on it
- if j is not found, backtrack by popping i from the stack (the new stack top is i's parent)
3. stop when we have backtracked to the first node and it has no unvisited neighbors
* slide 19 page 43 for visual *
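
An iterative sketch of this stack-based procedure (using an adjacency-list graph rather than the matrix "m", and picking the first unvisited neighbor instead of a random one):

```python
def dft(adj, start):
    """Depth-first traverse; the stack tracks the path for backtracking."""
    order = []
    seen = set()
    stack = [start]
    while stack:
        i = stack[-1]                  # current node at the top of the stack
        if i not in seen:
            seen.add(i)                # step 1: visit i
            order.append(i)
        # step 2: look for an unvisited neighbor j of i
        nxt = next((j for j in adj[i] if j not in seen), None)
        if nxt is not None:
            stack.append(nxt)          # move deeper
        else:
            stack.pop()                # backtrack: new top is i's parent
    return order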

Complexity of AVL Tree: time complexity for deletion (worst case) *regular remove + addressing violations*

- visit N1 nodes to find the target: O(N1) time
- visit N2 nodes to find the replacement: O(N2) time
- total time for the regular remove, worst case: O(N1+N2) = O(h)
- visit N3 nodes to find the violation and restore: O(N3) time
- visit N4 nodes to make sure there is no more violation upwards: O(N4) time
- total time for addressing violations, worst case: O(N3+N4) = O(h)

How do you find neighbors in a matrix-based implementation of a graph?

we can find the neighbors of node i by searching in row i of matrix m if m[i][j] = 1, then node j is a neighbor

Splay in scenario 6: X has both P(right) and G(left)

we first take a counter clockwise rotation at P, then a clockwise rotation at G

Splay in scenario 3: X has both P(left) and G (left)

we take two clockwise rotations at G (not one at P and one at G)

Complexity of Hash Table LP: time complexity for insertion

worst case - the only empty cell is just above the starting cell, so we check h cells
- check if cell is empty (T1)
- update index to the next cell (T2)
total probing time is (T1 + T2) * h + T1 = O(h) (at most h + 1 cells are checked)
- after probing, add to the empty cell (T3 = O(1))
- total time = O(h) + O(1) = O(h + 1) = O(h)
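
A sketch of linear-probing insertion, assuming a simple k % h hash (the actual hash function is not specified in these notes) and `None` for an empty cell:

```python
def lp_insert(table, key):
    """Insert key into a fixed-size table via linear probing; returns the index."""
    h = len(table)
    i = key % h                   # hypothetical hash: key mod table size
    for _ in range(h):            # probe at most h cells
        if table[i] is None:      # T1: check if cell is empty
            table[i] = key        # T3: add to the empty cell
            return i
        i = (i + 1) % h           # T2: move to the next cell (wrap around)
    raise ValueError("table is full")
```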

Time complexity of search in Red-Black tree

worst case: O(h) = O(log_2n) best case: O(1)

Complexity of AVL Tree: time complexity for search

worst case: O(log_2 n) best case: O(1)

