Algorithms and their properties
Heap select
Description: Solves the same selection problem as quick select, but uses a heap instead of partitioning: build a heap from the array in O(n), then extract the minimum (or maximum) k times at O(log n) each. Time complexity: O(n + k log n) Not in-place
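The heap-select idea can be sketched with Python's heapq module (a minimal illustration, not a tuned implementation):

```python
import heapq

def heap_select(arr, k):
    """Return the k-th smallest element of arr (1-indexed).

    heapify builds the heap in O(n); each of the k pops costs
    O(log n), giving the O(n + k log n) bound."""
    heap = list(arr)
    heapq.heapify(heap)           # O(n) bottom-up heap construction
    for _ in range(k - 1):        # discard the k-1 smallest elements
        heapq.heappop(heap)
    return heap[0]                # the k-th smallest is now at the root
```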
Quick Select
Description: Similar to quicksort, it partitions the array around a pivot, then recurses only into the side that contains the kth element, until the pivot lands at position k. Time complexity: Worst case = n^2 Average case = n Best case = n Decrease-Conquer Generally In-place Unstable
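A minimal quickselect sketch; for clarity this version partitions into new lists rather than in place (the classic version partitions the array in place):

```python
import random

def quickselect(arr, k):
    """Return the k-th smallest element of arr (1-indexed), average O(n)."""
    pivot = random.choice(arr)          # random pivot makes the O(n^2) case unlikely
    lt = [x for x in arr if x < pivot]  # elements smaller than the pivot
    eq = [x for x in arr if x == pivot]
    gt = [x for x in arr if x > pivot]
    if k <= len(lt):
        return quickselect(lt, k)                  # answer lies left of the pivot
    if k <= len(lt) + len(eq):
        return pivot                               # the pivot is the k-th element
    return quickselect(gt, k - len(lt) - len(eq))  # recurse right with shifted rank
```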
Stable Sorting Algorithm
preserves the relative order of any two equal elements in its input
Decrease and Conquer
to recursively reduce a problem to a single smaller instance of the same problem until the instance is small enough to solve directly (contrast with divide and conquer, which splits into two or more instances)
Divide and Conquer
A program design strategy in which tasks are broken down into subtasks, which are broken down into sub-subtasks, and so on, until each piece is small enough to code comfortably. These pieces work together to accomplish the total job.
Insertion Sort
Description: Best sorting algorithm for small or mostly sorted arrays. Builds the sorted result one element at a time, inserting each element into its correct position within the already-sorted prefix. Time complexity: Worst case = n^2 Average case = n^2 Best case = n In-place Stable
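A minimal insertion sort sketch; note the O(n) best case when the array is already sorted (the inner while loop never runs):

```python
def insertion_sort(arr):
    """In-place, stable insertion sort."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:   # shift larger elements one slot right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                 # insert key into its sorted position
    return arr
```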
Sequential sort
Description: Also known as bubble sort; repeatedly iterates through the array, swapping adjacent elements that are out of order, until a full pass makes no swaps. Time complexity: O(n^2) In-place Stable
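A minimal bubble sort sketch with the early-exit check the card describes:

```python
def bubble_sort(arr):
    """In-place, stable bubble sort; stops when a pass makes no swaps."""
    n = len(arr)
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):
            if arr[i] > arr[i + 1]:      # adjacent pair out of order
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
        n -= 1                           # the largest element has bubbled to the end
    return arr
```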
Karatsuba's Algorithm
Description: Fast multiplication algorithm for large integers. Splits each n-digit number into a high and a low half and computes the product with three recursive multiplications instead of four. Time complexity: Worst case = n^log2(3) ≈ n^1.585 Average case = n^1.585 Divide-Conquer Not in-place
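A minimal sketch of Karatsuba multiplication for non-negative integers; the key step is computing the cross terms from a single extra product:

```python
def karatsuba(x, y):
    """Multiply two non-negative integers with three recursive products."""
    if x < 10 or y < 10:                 # base case: single-digit multiply
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)   # split each number at m digits
    high_y, low_y = divmod(y, 10 ** m)
    a = karatsuba(high_x, high_y)        # product of the high parts
    b = karatsuba(low_x, low_y)          # product of the low parts
    # one extra product yields both cross terms: (hx+lx)(hy+ly) - a - b
    c = karatsuba(high_x + low_x, high_y + low_y) - a - b
    return a * 10 ** (2 * m) + c * 10 ** m + b
```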
Merge Sort
Description: Best for sorting large arrays. Divides array into two halves and sorts them recursively, then merges halves back together. Time complexity: Worst case = n log n Average case = n log n Best case = n log n Divide-Conquer Not in-place Stable
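A minimal merge sort sketch; the `<=` comparison in the merge is what keeps it stable:

```python
def merge_sort(arr):
    """Stable O(n log n) merge sort; uses O(n) auxiliary space."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])         # sort each half recursively
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps equal elements in input order
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:] # append whichever half has leftovers
```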
Heap Sort
Description: Builds a binary max-heap, then repeatedly extracts the maximum element from the heap and places it in its correct position. Useful when memory is tight, since it sorts in place with O(1) extra space. Time complexity: Worst case = n log n Average case = n log n Best case = n log n In-place Unstable
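A minimal in-place heap sort sketch, showing both phases the card describes: bottom-up heap construction, then repeated extraction of the maximum:

```python
def heap_sort(arr):
    """In-place, unstable heap sort."""
    def sift_down(end, root):
        # restore the max-heap property in arr[root:end]
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1               # pick the larger child
            if arr[root] >= arr[child]:
                return
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):  # phase 1: build max-heap, O(n)
        sift_down(n, i)
    for end in range(n - 1, 0, -1):      # phase 2: extract max n-1 times
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(end, 0)
    return arr
```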
Timsort
Description: Hybrid algorithm that combines both merge sort and insertion sort; exploits already-sorted runs in the input. Time complexity: Worst case = n log n Average case = n log n Best case = n Divide-Conquer Not in-place Stable
Introsort
Description: Hybrid algorithm that uses quicksort, and switches to heapsort when recursion depth exceeds a certain limit, guaranteeing an n log n worst case. Time complexity: Worst case = n log n Average case = n log n Divide-Conquer In-place Not Stable
Binary search
Description: It repeatedly halves a sorted array's search interval, checking whether the target value lies in the left or right half, until the target is found or the search space is exhausted. Time complexity: Worst case = log n Average case = log n Best case = O(1) Decrease-Conquer Not in-place
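A minimal iterative binary search sketch over a sorted list:

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent. O(log n)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid                   # best case: found on the first probe
        if arr[mid] < target:
            lo = mid + 1                 # target can only be in the right half
        else:
            hi = mid - 1                 # target can only be in the left half
    return -1
```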
Grade school algorithm
Description: Multiplication like you did in school where you multiply two numbers by hand. Time complexity: O(n^2)
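The hand method can be sketched directly: one partial product per digit of the multiplier, each shifted one place further left (n digits times n digits gives the O(n^2) digit multiplications):

```python
def grade_school_multiply(x, y):
    """O(n^2) digit-by-digit multiplication, as done by hand."""
    result = 0
    shift = 0
    for d in reversed(str(y)):           # each digit of y, least significant first
        partial = x * int(d)             # one row of the hand computation
        result += partial * 10 ** shift  # shift the row left, then add
        shift += 1
    return result
```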
Radix Sort
Description: Non-comparative sorting algorithm that distributes elements into buckets one digit at a time. Useful for fixed-width keys. Time complexity: O(kn) for k-digit keys Usually not in-place Stable (the LSD variant relies on each digit pass being stable)
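A minimal LSD (least-significant-digit-first) radix sort sketch for non-negative integers, using ten buckets per decimal digit:

```python
def radix_sort(arr):
    """LSD radix sort; O(k*n) for k-digit non-negative integer keys."""
    if not arr:
        return arr
    exp = 1
    while max(arr) // exp > 0:           # one stable pass per decimal digit
        buckets = [[] for _ in range(10)]
        for x in arr:
            buckets[(x // exp) % 10].append(x)  # append order preserves stability
        arr = [x for bucket in buckets for x in bucket]
        exp *= 10
    return arr
```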
Euclid's Algorithm
Description: Recursive algorithm used to find the greatest common divisor of two positive integers. GCD(a, b) = GCD(b, a mod b) Time complexity: Worst case = log n Average case = log n Best case = O(1) Decrease and conquer
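The recurrence GCD(a, b) = GCD(b, a mod b) translates directly into a short loop (the iterative form of the recursion):

```python
def gcd(a, b):
    """Euclid's algorithm for two positive integers. O(log min(a, b))."""
    while b:
        a, b = b, a % b                  # decrease-and-conquer: shrink the problem
    return a
```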
Quick Sort
Description: Selects a pivot element, partitions the remaining elements into two subarrays, and recurses on each until the array is sorted. Median-of-three pivot selection guards against poor pivots on already-sorted input; a random pivot makes the worst case unlikely. (Finding the exact median in linear time requires the separate median-of-medians algorithm.) Time complexity: Worst case = n^2 Average case = n log n Best case = n log n Divide-Conquer In-place Unstable
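A minimal in-place quicksort sketch with a random pivot and a Hoare-style partition:

```python
import random

def quicksort(arr, lo=0, hi=None):
    """In-place quicksort; average O(n log n)."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    pivot = arr[random.randint(lo, hi)]  # random pivot avoids the sorted-input worst case
    i, j = lo, hi
    while i <= j:                        # Hoare-style partition around the pivot value
        while arr[i] < pivot:
            i += 1
        while arr[j] > pivot:
            j -= 1
        if i <= j:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
            j -= 1
    quicksort(arr, lo, j)                # recurse on both partitions
    quicksort(arr, i, hi)
    return arr
```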
Topological sort
Description: a graph algorithm that orders the vertices of a directed acyclic graph (DAG) such that for every directed edge (u, v), the vertex u comes before v in the ordering. It is a decrease-and-conquer algorithm in the sense that it repeatedly removes a vertex with no incoming edges (and its outgoing edges) from the graph, until all vertices have been removed. Time complexity: O(V+E) Decrease-Conquer Not in-place
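The remove-a-source-vertex procedure described above is Kahn's algorithm; a minimal sketch over an edge list:

```python
from collections import deque

def topological_sort(vertices, edges):
    """Kahn's algorithm: repeatedly take a vertex with indegree 0. O(V+E)."""
    indegree = {v: 0 for v in vertices}
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        indegree[v] += 1
    queue = deque(v for v in vertices if indegree[v] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:                 # "remove" u's outgoing edges
            indegree[v] -= 1
            if indegree[v] == 0:
                queue.append(v)
    return order  # shorter than len(vertices) exactly when the graph has a cycle
```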
Max-Heapify
Description: restores the max-heap property at a node by repeatedly swapping it with its larger child (sifting down), assuming both of its child subtrees are already max-heaps. Used in bottom-up heap construction and after each extraction in heap sort. Time complexity: O(log n)
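A minimal sift-down sketch on an array-backed binary heap (children of index i at 2i+1 and 2i+2):

```python
def max_heapify(heap, i, n=None):
    """Sift heap[i] down until the subtree rooted at i is a max-heap,
    assuming both child subtrees already satisfy the heap property."""
    if n is None:
        n = len(heap)
    while True:
        largest = i
        left, right = 2 * i + 1, 2 * i + 2
        if left < n and heap[left] > heap[largest]:
            largest = left
        if right < n and heap[right] > heap[largest]:
            largest = right
        if largest == i:                 # heap property restored
            return
        heap[i], heap[largest] = heap[largest], heap[i]
        i = largest                      # continue sifting down
```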
Integer Square Root
Description: calculates the largest integer less than or equal to the square root of a non-negative integer n Time complexity: Worst case = O(log n) Average case = O(log n) Best case = O(1)
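One way to reach the O(log n) bound is a binary search on the answer (Python's standard library also provides math.isqrt); a minimal sketch:

```python
def isqrt(n):
    """Largest integer i with i*i <= n, for non-negative n. O(log n)."""
    if n < 2:
        return n                         # best case: 0 and 1 are their own roots
    lo, hi = 1, n
    while lo <= hi:
        mid = (lo + hi) // 2
        if mid * mid <= n:
            lo = mid + 1                 # mid is feasible; try something larger
        else:
            hi = mid - 1                 # mid overshoots; try something smaller
    return hi                            # hi is the last feasible value
```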
Interpolation search
Description: uses the principle of binary search, but instead of always dividing the search interval in half, it estimates the position of the target value based on its value relative to the values at the endpoints of the interval. Time complexity: Worst case = n Average case = log log n (on uniformly distributed sorted data) Best case = O(1) Decrease-Conquer Not in-place
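A minimal interpolation search sketch over a sorted list of integers; the probe position is a linear interpolation between the interval's endpoint values:

```python
def interpolation_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi and arr[lo] <= target <= arr[hi]:
        if arr[lo] == arr[hi]:           # all remaining values equal; avoid /0
            break
        # estimate where target would sit if values were evenly spaced
        pos = lo + (target - arr[lo]) * (hi - lo) // (arr[hi] - arr[lo])
        if arr[pos] == target:
            return pos
        if arr[pos] < target:
            lo = pos + 1
        else:
            hi = pos - 1
    return lo if lo < len(arr) and arr[lo] == target else -1
```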
In place algorithm
operates directly on the input data, requiring only a constant (or at most logarithmic) amount of additional memory beyond the input itself.