An algorithm's runtime complexity is a function, T(N), that represents

the number of constant time operations performed by the algorithm on an input of size N.

Given a key, search algorithm returns first node whose data matches that key, or returns null if matching node not found. A simple linked list search algorithm checks current node

(initially list's head node), returning node if match, else pointing current to next node, etc. If pointer to current node is null, returns null (matching node not found).
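
A minimal sketch of this linked list search in Python, assuming a Node class with data and next fields (names are illustrative, not from the source):

class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def list_search(head, key):
    # Start at the head node and walk forward until a match or the end of the list.
    cur = head
    while cur is not None:
        if cur.data == key:
            return cur    # first node whose data matches the key
        cur = cur.next
    return None           # no matching node found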

A hash table's load factor is # items in hash table divided by number of buckets. Hash table w 18 items and 31 buckets has a load factor of 18/31 ≈ 0.58. An implementation may choose to resize when one or more of the following values exceeds a certain threshold (a small sketch of the check follows the list below):

1. Load factor
2. When using open addressing, number of collisions during an insertion
3. When using chaining, size of a bucket's linked list
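
A small sketch of the load-factor check, assuming a chaining table stored as a Python list of bucket lists and a hypothetical threshold of 0.7:

def load_factor(buckets, num_items):
    # Number of items divided by number of buckets, e.g. 18 / 31 ≈ 0.58.
    return num_items / len(buckets)

def needs_resize(buckets, num_items, threshold=0.7):
    # Resize when the load factor exceeds the chosen threshold.
    return load_factor(buckets, num_items) > threshold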

Set

ADT for unordered collection of distinct (no duplicates) items. Binary search tree, hash table.

Queue

ADT in which items are inserted at end of queue and removed from front of queue. Linked list

Stack

ADT in which items are only inserted on or removed from top of stack. Linked list

Deque (double-ended queue)

ADT in which items can be inserted and removed at both the front and back. Linked list

Dictionary (Map)

ADT that associates (or maps) keys with values. Hash table, binary search tree

Given new node, singly-linked list Prepend inserts the new node before list's head node. Prepend behavior differs depending on whether list is empty or not:

Empty: If head pointer is null, points head and tail pointers to new node. Non-empty: If head pointer is not null, points new node's next pointer to head node, and points list's head pointer to new node.
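
A sketch of Prepend in Python for a singly-linked list with head and tail pointers; the LinkedList class and field names are assumptions for illustration:

class LinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def prepend(self, new_node):
        if self.head is None:
            # Empty list: head and tail both point to the new node.
            self.head = new_node
            self.tail = new_node
        else:
            # Non-empty list: link the new node before the current head.
            new_node.next = self.head
            self.head = new_node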

Certain binary tree structures can affect speed of operations on tree. Special types of binary trees:

Full: every node contains 0 or 2 children. Complete: all levels, except possibly the last, contain all possible nodes and all nodes in last level are as far left as possible. Perfect: all internal nodes have 2 children, all leaf nodes are at same level.

Given new node, singly-linked list InsertAfter inserts new node after given existing list node. curNode is pointer to existing list node. Three insertion scenarios:

Insert as list's first node: If head pointer is null, points the list's head and tail pointers to the new node. Insert after list's tail node: If head pointer not null (list not empty) and curNode points to tail node, points tail node's next pointer and tail pointer to new node. Insert in middle of list: If head pointer not null and curNode does not point to list's tail node, points new node's next pointer to curNode's next node, and points curNode's next pointer to new node.
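
A sketch of InsertAfter covering the three scenarios, reusing the assumed LinkedList/Node classes from the earlier sketches:

def insert_after(lst, cur_node, new_node):
    if lst.head is None:
        # Insert as the list's first node.
        lst.head = new_node
        lst.tail = new_node
    elif cur_node is lst.tail:
        # Insert after the tail node: the new node becomes the new tail.
        lst.tail.next = new_node
        lst.tail = new_node
    else:
        # Insert in the middle: splice the new node between curNode and its next node.
        new_node.next = cur_node.next
        cur_node.next = new_node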

InsertAfter(list, w, x)

Inserts x after w

GetLength: Return number of nodes (expected to be stored in the heap's member data): Worst Case:

O(1)

IsEmpty: Return true if no nodes in heap, else false: Worst Case:

O(1)

Peek: Return value in root node: Worst Case:

O(1)

Priority queue: COMMONLY implemented w heap. Heap will keep highest priority item in root node and allow access in

O(1) time. Adding and removing items from queue will operate in worst-case O(logN) time.
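
A sketch using Python's heapq module, which maintains a min-heap, so the smallest numerical priority counts as highest priority (item names are made up):

import heapq

pq = []
heapq.heappush(pq, (2, "wash dishes"))   # (priority, item); push is O(log N)
heapq.heappush(pq, (1, "pay bills"))
heapq.heappush(pq, (3, "mow lawn"))

print(pq[0])              # peek at the root in O(1): (1, 'pay bills')
print(heapq.heappop(pq))  # pop the highest priority item in O(log N)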

Pop: Remove: Worst Case:

O(logN)

Print(list)

Prints list's items in order

PrintReverse(list)

Prints list's items in reverse order

Remove(list, x)

Removes x

Pop(queue)

Returns and removes item at front of queue

Pop(stack)

Returns and removes item at top of stack

Peek(queue)

Returns but does not remove item at the front of the queue

Peek(stack)

Returns but does not remove item at top of stack

Search(list, x)

Returns item if found, else returns null

Sort(list)

Sorts the list's items in ascending order

In a BST without parent pointers, search for a node's parent can be implemented recursively, searching for a parent in a way similar to normal BSTSearch algorithm. But instead of comparing

a search key against candidate node's key, node is compared against candidate parent's child pointers.

Alg's runtime may vary significantly based on input data. Best case: alg does min possible # operations. Worst case: algorithm does max possible ops. Best or worst case describes contents of algorithm's input data only. Input data size must remain

a var, N, or most algs would have best case N=0 (no input data processed). "best case: alg doesn't process any data" is not useful. Complexity analysis always treats input data size as var.

Data structure: way of organizing, storing, and performing operations on data. Operations performed on data structure include

accessing or updating stored data, searching for specific data, inserting new data, removing data.

Forward traversal through linked list can be implemented using a recursive function that takes node as an argument. If non-null, node is visited first. Then recursive call is made on node's next pointer, to traverse rest of list. ListTraverse func takes a list as

argument, and traverses entire list by calling ListTraverseRecursive on list's head.
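
A sketch of the recursive forward traversal, with printing standing in for "visiting" a node (function names mirror the description):

def list_traverse_recursive(node):
    if node is not None:
        print(node.data)                    # visit the current node first
        list_traverse_recursive(node.next)  # then traverse the rest of the list

def list_traverse(lst):
    # Traverse the entire list starting from the head.
    list_traverse_recursive(lst.head)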

A mid-square hash squares key, extracts R digits from result's middle, and returns remainder of middle digits divided by hash table size N. For hash table w 100 entries and key of 453, decimal (base 10) mid-square hash func computes 453 * 453 = 205209, and returns middle two digits 52. For N buckets, R must

be greater than or equal to ⌈log10N⌉ to index all buckets. The process of squaring and extracting middle digits reduces the likelihood of keys mapping to just a few buckets.
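
A decimal (base 10) mid-square sketch that reproduces the 453 example, extracting R = 2 middle digits from the squared key (function name is illustrative):

def mid_square_hash_decimal(key, r, table_size):
    square = str(key * key)                       # 453 * 453 = 205209
    start = (len(square) - r) // 2                # index where the middle r digits begin
    middle_digits = int(square[start:start + r])  # middle two digits of 205209 -> 52
    return middle_digits % table_size

print(mid_square_hash_decimal(453, 2, 100))   # 52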

BST insert algorithm traverses tree from root to leaf node to find insertion location. One node visited per level. BST with N nodes has at least log2N levels and at most N levels. Runtime complexity of insertion:

best case O(logN), worst case O(N). Space complexity of insertion is O(1); only one pointer is used to traverse tree to find insertion location.

Graph:

data structure for representing connections among items, and consists of vertices connected by edges. Vertex: item in a graph. Edge: connection between two vertices in graph.

Binary tree

data structure in which each node stores data and has up to two children, known as a left child and a right child.

Hash table

data structure that stores unordered items by mapping (or hashing) each item to a location in an array.

Abstract data type (ADT)

data type described by predefined user operations, such as "insert data at rear," w/o indicating how each operation is implemented. Can be implemented w different underlying data structures; programmer needs no knowledge of underlying implementation

Queue push operation inserts item at end of queue; queue pop operation removes and returns item at

front of queue. First-in first-out ADT. Can be implemented w linked list, array, or vector. Waiting in line at the grocery store: a person enters at the end of the line and exits at the front.

BST search worst case = H + 1 comparisons = O(H) comparisons, where H is tree height. N-node binary tree's height may be as small as O(logN); extremely fast searches. 10,000 node list may require 10,000 comparisons, but BST may need only 14. A binary tree's

height can be minimized by keeping all levels full, except possibly last level. Such an "all-but-last-level-full" binary tree's height is H=⌊log2N⌋.

A direct hash function uses item's key as bucket index. If key is 937, index is 937. Hash table w direct hash function: direct access table. Given key, direct access table search alg returns

item at index key if bucket not empty and null (item not found) if empty. No collisions: Each key is unique and gets a unique bucket. However: All keys must be non-negative ints, but for some applications keys may be negative. Hash table's size equals largest key value plus 1; may be very large.

The approach for hash table algorithm determining whether cell is empty depends on implementation. If items are non-neg integers, empty can be represented as -1. More commonly, items are each

object w multiple fields (name, age...), in which case each hash table array element may be pointer. Using pointers, empty can be represented as null.

A stack is often implemented using a linked list, with head node as top. Push: done by prepending item to list. Pop: done by

pointing local var to head node, removing head node from list, and returning local var.
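
A linked-list stack sketch where the head node is the top of the stack, reusing the assumed Node class from the search sketch above:

class LinkedStack:
    def __init__(self):
        self.head = None          # head node is the top of the stack

    def push(self, data):
        node = Node(data)
        node.next = self.head     # prepend: new node becomes the top
        self.head = node

    def pop(self):
        # Assumes the stack is not empty.
        top = self.head           # point a local variable to the head node
        self.head = top.next      # remove the head node from the list
        return top.data           # return the removed item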

Given key, search algorithm returns first node found matching that key, or null if matching node not found. Simple BST search algorithm checks current node (initially tree's

root), returning node if match, else assigning current node with left (if key is less) or right (if key is greater) child, etc. If child is null, returns null (matching node not found).

In a binary search tree (BST), any node's left subtree keys ≤ the node's key, and the right subtree's keys ≥ the node's key. Fast searching:

to search nodes means to find node w a desired key, if node exists. A BST may yield faster searches than list. Searching a BST starts by visiting root node (which is the first currentNode)

BST search can be implemented using recursion; single node and search key are passed as args to the recursive search function. Two base cases. First: when the node is null; null is returned. If node non-null, search key is compared to node's key. Second:

when search key equals the node's key; node is returned. If search key < node's key, recursive call is made on node's left child. If search key greater than node's key, recursive call is made on node's right child.
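
A sketch of the recursive BST search with both base cases, assuming nodes with key, left, and right fields:

def bst_search_recursive(node, key):
    if node is None:
        return None                                    # base case 1: not found
    if key == node.key:
        return node                                    # base case 2: match found
    if key < node.key:
        return bst_search_recursive(node.left, key)    # search left subtree
    return bst_search_recursive(node.right, key)       # search right subtree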

Edge: the link from a node to a child. Node's depth = # edges on the path from root to node. Root node has depth of

0. All nodes w same depth form tree level. Tree's height is largest depth of any node. A tree w just one node has height 0.

Dynamic array

ADT for holding ordered data and allowing indexed access. Array

List

ADT for holding ordered data. Array, linked list

Bag

ADT for storing items in which order does not matter and duplicate items are allowed. Array, linked list

Given key, BST remove removes first-found matching node, restructuring tree to preserve BST ordering property. Algorithm searches for a matching node like the search algorithm. If node X found, algorithm does one of following: Remove leaf node: If X has parent (X is not root), parent's left or right child (whichever points to X) is assigned with null. Else root pointer is assigned w null, and BST is empty. Remove internal node w single child:

If X has a parent (so X is not root), parent's left or right child (whichever points to X) is assigned w X's single child. Else root pointer is assigned w X's single child. Remove an internal node with 2 children: algorithm locates X's successor (the leftmost child of X's right subtree), and copies successor to X. Then recursively removes successor from right subtree.

Append(list, x)

Inserts x at end of list

Push(queue, x)

Inserts x at end of the queue

Prepend(list, x)

Inserts x at start of list

Push(stack, x)

Inserts x on top of stack

Push: Insert: Worst Case:

O(logN)

Memory allocation: app requesting + being granted memory. Mem used by Python app must be granted by OS. When application requests a specific amount of mem from OS, OS can grant or deny. Some langs make programmer write mem-allocating code;

Python runtime handles memory allocation. Creating Python list and then appending 100 items means mem for the 100 items must be allocated. Python runtime allocates mem for lists and other objects as needed, and must request that mem from the OS.

Given existing node X in singly-linked list, RemoveAfter removes node after X. Existing node must be specified; each node in singly-linked list only has pointer to next node. curNode points to existing list node. Because head node is not after another node, if curNode is null, RemoveAfter does special case that removes head node. Otherwise, removes node after curNode. Scenarios:

Remove list's head node (special case): If curNode is null, points sucNode to head node's next node, and points head pointer to sucNode. If sucNode is null, the only list node was removed, so tail pointer is pointed to null (list is now empty). Remove node after curNode: If curNode's next pointer is not null (a node after curNode exists), points sucNode to the node after curNode's next node. Then curNode's next pointer is pointed to sucNode. If sucNode is null, tail node was removed, so points tail pointer to curNode (new tail node).
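
A sketch of RemoveAfter covering both scenarios, reusing the assumed LinkedList/Node classes from earlier sketches:

def remove_after(lst, cur_node):
    if cur_node is None and lst.head is not None:
        # Special case: remove the head node.
        suc_node = lst.head.next
        lst.head = suc_node
        if suc_node is None:      # the only node was removed; list is now empty
            lst.tail = None
    elif cur_node is not None and cur_node.next is not None:
        # Remove the node after curNode.
        suc_node = cur_node.next.next
        cur_node.next = suc_node
        if suc_node is None:      # the tail node was removed
            lst.tail = cur_node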

GetLength(stack)

Returns number items in stack

GetLength(list)

Returns the number of items in the list

GetLength(queue)

Returns the number of items in the queue

IsEmpty(list)

Returns true if list has no items

IsEmpty(queue)

Returns true if queue has no items

IsEmpty(stack)

Returns true if stack has no items.

A good, fast hash function minimizes collisions; a perfect hash function (one that maps items to buckets with no collisions, giving O(1) runtime for insert, search, and remove) can be made if number of items and all possible item keys are known beforehand. A good hash function should uniformly distribute items into buckets. With chaining: short bucket lists, fast inserts, searches, and removes. With linear probing:

avoid hashing multiple items to consecutive buckets, minimizing avg linear probing length for fast inserts, searches, and removes. On avg, good hash func will achieve O(1) inserts, searches, and removes; worst-case may require O(N). Hash function's performance depends on hash table size and knowledge of expected keys. A modulo hash uses remainder from division of key by hash table size N.

BST remove algorithm traverses tree from root to find node to remove. When node being removed has 2 children, node's successor is found and recursive call is made. One node is visited per level; worst case tree is traversed twice from root to leaf. BST w N nodes has >= log2N levels and <= N levels. Runtime complexity of removal:

best case O(logN), worst case O(N). 2 pointers are used to traverse tree in removal. When node being removed has 2 children, third pointer and a copy of one node's data are also used, and one recursive call is made. Space complexity of removal: always O(1).

Linked list

data structure that stores ordered list of items in nodes, where each node stores data and has pointer to next node. Better than array for fast data insertion: inserting at start of 2-item array moves 2 items; inserting at start of 2-item linked list moves 0.

Array

data structure that stores ordered list of items, where each item is directly accessible by positional index.

Record

data structure that stores subitems, with name associated with each subitem.

Chaining handles hash table collisions by using list for each bucket; each list may store multiple items that map to same bucket. Insert uses item's key to

determine bucket, then inserts item in bucket's list. Searching also first determines bucket, then searches bucket's list. Likewise for removes.

A collision occurs when item being inserted into a hash table maps to same bucket as existing item in hash table. Various collision resolution techniques are used to handle collisions during insertions, such as chaining or open addressing. Chaining:

each bucket has list of items; if 55 occupies bucket 5 and 75 also hashes to bucket 5, bucket 5's list becomes 55, 75. Open addressing: looks for empty bucket elsewhere in table (so 75 might be stored in bucket 6).
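
A chaining sketch where each bucket holds a Python list and a modulo hash maps keys to buckets (class and method names are illustrative):

class ChainingHashTable:
    def __init__(self, num_buckets=10):
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        return self.buckets[key % len(self.buckets)]   # hash key to its bucket's list

    def insert(self, key):
        self._bucket(key).append(key)

    def search(self, key):
        return key if key in self._bucket(key) else None

    def remove(self, key):
        bucket = self._bucket(key)
        if key in bucket:
            bucket.remove(key)

table = ChainingHashTable()
table.insert(55)    # 55 % 10 = 5, goes to bucket 5
table.insert(75)    # collision: bucket 5's list becomes [55, 75]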

BST insertion and removal can also be implemented using recursion. Insertion uses recursion to traverse down tree until insertion location is found. Removal uses recursive search functions to

find node and node's parent, then removes node from tree. If node to remove is internal node w 2 children, node's successor is recursively removed.

Alg's space complexity: func S(N) = # fixed-size mem units used for input size N. Space complexity of alg duplicating list of nums: S(N) = 2N + k. Const k = mem for things like loop counter, list pointers. Space complexity:

input data and additional mem allocated by alg. Alg's auxiliary space complexity: space complexity besides input data. Ex: Alg to find max num in list has space complexity S(N) = N + k, auxiliary space complexity S(N) = k, where k is a const

Algorithm's efficiency: typically measured by

its computational complexity.

Stack push inserts item on top of stack. Stack pop removes and returns item at top of stack. Referred to as

last-in first-out ADT. Can be implemented w linked list, array, or vector.

A BST defines an ordering among nodes, from smallest to largest. A BST node's successor is node that comes after in BST ordering, so in A B C, A's successor is B, and B's successor is C. BST node's predecessor is the node that comes before in BST ordering. If a node has right subtree, the node's successor is that right subtree's

leftmost child: Starting from right subtree's root, follow left children until reaching a node with no left child (may be that subtree's root itself). If a node doesn't have right subtree, node's successor is the first ancestor w this node in a left subtree. Another section provides algorithm for printing BST's nodes in order.

Heap

max-heap: tree where node's key >= node's children's keys. min-heap: tree where node's key <= node's children's keys. For priority queue: if smallest numerical value is highest priority, min value should be in heap's root node. min-heap has min val in root node.

Mid-square hash function: usu implemented w binary (base 2), not decimal; binary implementation is faster. decimal implementation requires converting square of key to string, extracting substring for middle digits, and converting substring to integer. A binary implementation only requires a few shift and bitwise AND operations. Extracts

middle R bits, returns remainder of middle bits divided by hash table size N, where R >= ⌈log2N⌉. Ex: For hash table size of 200, R = 8, 8 bits are needed for indices 0 to 199. Extracted middle bits depend on max key. Ex: A key w val 4000 requires 12 bits. A 12-bit num squared needs up to 24 bits. For R = 10, middle 10 bits of 24-bit squared key are bits 7 to 16.
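
A binary mid-square sketch using only a shift and a bitwise AND, assuming a 24-bit squared key as in the example (bits 7 to 16 are the middle R = 10 bits):

def mid_square_hash_binary(key, r, table_size, squared_bits=24):
    square = key * key
    low_bit = (squared_bits - r) // 2                   # (24 - 10) // 2 = 7
    middle_bits = (square >> low_bit) & ((1 << r) - 1)  # keep the r middle bits
    return middle_bits % table_size

print(mid_square_hash_binary(4000, 10, 200))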

Multiplicative string hash repeatedly multiplies hash value and adds ASCII (Unicode) value of each char in str. Multiplicative hash func for strings starts w large initial value. For each char, hash function multiplies current hash value by

multiplier (often prime) and adds char's value. Finally, function returns remainder of sum divided by hash table size N. Daniel J. Bernstein created popular version that uses an initial value of 5381 and multiplier of 33. Good for short English strings.
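
A sketch of the Bernstein (djb2) multiplicative string hash with initial value 5381 and multiplier 33:

def hash_djb2(s, table_size):
    h = 5381                      # large initial value
    for ch in s:
        h = h * 33 + ord(ch)      # multiply by 33, add the character's value
    return h % table_size         # remainder of the sum divided by table size N

print(hash_djb2("cat", 100))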

Hash table resize operation increases number of buckets, preserving all existing items. A hash table with N buckets is commonly resized to

next prime number ≥ N * 2. New array is allocated; all items from old array are re-inserted into new array, making resize operation's time complexity O(N).

In list, each node has <= one successor. In binary tree, each node has <= 2 children, left child and right child. Leaf: tree node w no children. Internal node: node w at least one child. Parent:

node w a child is that child's parent. A node's ancestors include node's parent, the parent's parent... up to tree's root. Root: the top node w no parent

Singly-linked list: data structure for implementing list ADT where each node has data and

pointer to next node. list structure typically has pointers to list's first and last node (head and tail). Type of positional list: elements have pointers to next and/or previous elements in list. null: value indicating pointer points to nothing

A queue is often implemented using a linked list, with head node as front, and tail node as end. Push: done by appending item to list. Pop: done by

pointing local var to head node, removing head node from list, and returning the local variable.

Priority queue

queue where each item has a priority, and items with higher priority are closer to front of queue than items with lower priority. Can have duplicates. Heap.

A common hash function uses modulo operator %, which computes int remainder when dividing two numbers. For a 20 element hash table, hash function of key % 20 will map keys to bucket indices 0 to 19. A hash table's operations of insert,

remove, and search each use hash function to determine item's bucket. Ex: for a 10-bucket table, inserting 113 first determines bucket to be 113 % 10 = 3.

Push(PriorityQueue,x) inserts item closer to front than all lower priority items, closer to end than all equal or higher priority items. Pop(PriorityQueue) removes, returns highest priority item (at front of queue). Priority queue usu supports peeking and length querying. Peek returns highest priority item, w/o

removing it from front of queue. Can be implemented so you can get each item's priority from item itself. Customer object may include service priority number. May be implemented so priorities are specified in call to PushWithPriority: push operation w arg for pushed item's priority.

Trees are commonly used to represent hierarchical data. A tree can

represent files and directories in a file system, since a file system is a hierarchy.

Computational complexity: amount of

resources used by algorithm. Most common resources considered: runtime + memory usage.

Recursive linked list search: implemented similar to forward traversal. Each call examines 1 node. If node null, null is returned. Else node's data is compared to

search key. If match, node is returned, else rest of list is searched recursively.

Binary space partitioning (BSP): technique of repeatedly separating region of space into 2 parts and cataloging objects contained in regions. BSP tree: binary tree used to store info for binary space partitioning. Each node in BSP tree contains info on a region of space and which objects are contained in region. Graphics applications: BSP tree can be used to

store all objects in multidimensional world. BSP tree can be used to efficiently determine which objects must be rendered on screen; viewer's position in space is used to perform a lookup within the BSP tree. Lookup quickly eliminates large number of objects not visible and that should not be rendered.

Most of the tree data structures discussed in this book serve to store a collection of values. Numerous tree types exist to

store data collections in a structured way that allows for fast searching, inserting, and removing of values.

Python: managed lang: objects are deallocated automatically by Python runtime, and not by programmer's code. Object that is no longer referenced by any variables: candidate for deallocation. Reference count: integer counter that represents how many vars reference object. When object's reference count is 0,

that object is no longer referenced. Python's garbage collector deallocates objects with ref count 0. Time btwn an object's reference count becoming 0 and object being deallocated may differ across diff Python runtime implementations

Given a new node, a BST insert operation inserts the new node in a proper location obeying the BST ordering property. A simple BST insert algorithm compares new node w current node (initially the root). Insert as left child: If new node's key < current node, and the current node's left child is null, assigns that node's left child with

the new node. Insert as right child: If new node's key > current node, and the current node's right child is null, the algorithm assigns the node's right child with the new node. Search for insert location: If the left (or right) child is not null, the algorithm assigns the current node with that child and continues searching for a proper insert location.
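
A sketch of the iterative BST insert, assuming node objects with key, left, and right fields; equal keys are sent to the right subtree here, which is one common convention:

def bst_insert(root, node):
    if root is None:
        return node               # empty tree: the new node becomes the root
    cur = root
    while True:
        if node.key < cur.key:
            if cur.left is None:
                cur.left = node   # insert as left child
                return root
            cur = cur.left        # keep searching in the left subtree
        else:
            if cur.right is None:
                cur.right = node  # insert as right child
                return root
            cur = cur.right       # keep searching in the right subtree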

Hash table: data structure that stores unordered items by mapping / hashing each item to location in array (or vector). Given array w indices 0..9 to store ints from 0..500, modulo (remainder) operator can be used to map 25 to index 5 (25 % 10 = 5), and 149 to index 9 (149 % 10 = 9). Main advantage: searching (or inserting / removing) item may need only O(1), vs O(N) for searching list or O(log N) for binary search. In hash table, item's key is

val used to map to index. For all items that might be stored in hash table, every key is ideally unique, so algorithms can search for specific item by key. Each hash table array element is bucket. Hash function computes bucket index from item's key.

