Algorithms: Khan Academy
What does n^0 equal?
1
What are the most common rates of growth?
1. O(1) 2. O(log(n)) 3. O(n) 4. O(n log(n)) 5. O(n^2) 6. O(n^3) 7. O(2^n)
What makes a good algorithm?
1. correctness 2. efficiency
What algorithms do you use in everyday life? Do you think you could write a program to make them more efficient?
1. getting ready 2. commute to work 3. studying 4. designing mockups 5. scheduling meals 6. writing code
What things do you need to understand to implement an algorithm?
1. inputs 2. outputs 3. what variables need to be created 4. initial values of the variables 5. what steps should be taken to compute other values and produce the output 6. do these steps repeat?
How would you make interviewing more efficient and raise your chances of success?
1. memorize common solutions 2. grasp recursion 3. have complex solutions on hand
How would you make designing mockups more efficient?
1. pick a color scheme 2. use inspiration to create a basic layout 3. use inspiration to build reusable components 4. apply the reusable components to the layout
Suppose we need at most m guesses for an array of length n. Then, for an array of length __, the first guess cuts the reasonable portion of the array down to size n, and at most m guesses finish up, giving us a total of at most m + 1 guesses.
2n
If an element in an array is at index 3, how many elements come before it?
3
What's the Big-O time of binary search?
O(log n)
What's the Big-O space of binary search?
O(1): the search only needs a few index variables, and they take up the same amount of space no matter how large the array is.
What's the Big-O for insertion sort?
O(n^2)
What's the run time for selection sort?
O(n^2)
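A minimal selection sort sketch (Python, my own variable names rather than the course's): the two nested loops each run on the order of n times, which is where the O(n^2) run time comes from.

```python
def selection_sort(a):
    """Sort list a in place by repeatedly selecting the minimum of the unsorted part."""
    n = len(a)
    for i in range(n):                       # outer loop: runs n times
        min_index = i
        for j in range(i + 1, n):            # inner loop: scans the remaining unsorted part
            if a[j] < a[min_index]:
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]  # move the smallest remaining value into position i

values = [13, 7, 25, 1, 9]
selection_sort(values)
print(values)  # [1, 7, 9, 13, 25]
```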
How do we think about the running time of an algorithm?
a function of the size of its input
What's a trick to add up consecutive numbers?
add the smallest and largest, then multiply by the number of pairs
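For example, 1 + 2 + ... + 100 pairs up as (1 + 100), (2 + 99), and so on: 50 pairs of 101, which is 5,050. A quick sketch to check the trick (assuming an even count of numbers starting at 1):

```python
n = 100
pair_sum = 1 + n               # smallest plus largest
num_pairs = n // 2             # 50 pairs
print(pair_sum * num_pairs)    # 5050
print(sum(range(1, n + 1)))    # 5050, same answer the slow way
```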
How do you measure efficiency?
asymptotic analysis
What's it called when you drop the constant coefficients and less significant terms from rate of growth?
asymptotic notation
What do we call the first case, where we immediately know the answer, in a recursive function?
base case
What's the mathematical way to write the number of times we repeatedly halve, starting at n, until we get the value 1?
base-2 logarithm of n
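A small sketch to illustrate (the helper name is mine): counting how many times n can be halved before reaching 1 matches the floor of the base-2 logarithm.

```python
import math

def halvings(n):
    """Count how many times n can be halved (integer division) before reaching 1."""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

for n in [1, 8, 25, 1024]:
    print(n, halvings(n), math.floor(math.log2(n)))  # the last two columns agree
```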
Why would you set the max index to the length - 1?
because arrays start at index 0; if there are 25 numbers, the last number is at index 24.
Why do we say asymptotically with Big-O notation?
because it only matters for large values of n
What is the notation for the upper bound?
big-o
What is the notation for the lower bound?
big-omega
Which algorithm works by repeatedly dividing in half the portion of the list that could contain the item, until you've narrowed down the possible locations to one?
binary search
Given a range of 26 to 80, if your guess of 53 is too high, what happens?
eliminate the numbers from 53 to 80 | update the max to the guess - 1 | if the guess is greater than the desired number, cut off the top half
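A minimal iterative binary-search sketch (Python, my own function and variable names, not the course's): each wrong guess halves the range by moving min or max, and only a couple of index variables are ever kept, which is why the extra space stays O(1).

```python
def binary_search(array, target):
    """Return the index of target in the sorted array, or -1 if absent."""
    min_index, max_index = 0, len(array) - 1       # e.g. 0 and 24 for 25 numbers
    while min_index <= max_index:
        guess = (min_index + max_index) // 2
        if array[guess] == target:
            return guess
        elif array[guess] > target:                # guess too high: cut off the top half
            max_index = guess - 1
        else:                                      # guess too low: cut off the bottom half
            min_index = guess + 1
    return -1

primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
print(binary_search(primes, 13))   # 6
print(binary_search(primes, 4))    # -1
```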
What's the name of the function we use when counting how many different orders there are for things, or how many ways we can combine them?
factorial
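For instance, 3 items can be ordered 3! = 6 different ways. A quick check of that claim using the standard library (the example items are mine):

```python
import math
from itertools import permutations

items = ["a", "b", "c"]
print(math.factorial(len(items)))        # 6
print(len(list(permutations(items))))    # 6 distinct orderings
```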
What kind of sort loops over positions in the array, moving each new element toward the front, past any elements whose values are greater?
insertion sort
What kind of sort repeatedly inserts an element in the sorted subarray to its left?
insertion sort
Suppose we need at most m guesses for an array of length n. Then, for an array of length 2n, the first guess cuts the reasonable portion of the array down to size n, and at most m guesses finish up, giving us a total of at most ___ guesses.
m + 1
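Unwinding that relationship: every doubling of the array length adds one guess, so an array of length n needs at most about floor(log2(n)) + 1 guesses. A rough back-of-the-envelope check (my own helper, not from the course):

```python
import math

def max_guesses(n):
    """Worst-case number of guesses binary search needs for n elements."""
    return math.floor(math.log2(n)) + 1

for n in [1, 2, 16, 32, 1_000_000]:
    print(n, max_guesses(n))   # 1 -> 1, 2 -> 2, 16 -> 5, 32 -> 6, 1000000 -> 20
```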
If an array contains 25 numbers, what are the starting indices for a binary search?
min: 0 max: 24
What does n^1 equal?
n
Does the base of a logarithm function matter in Big-O?
no, because changing the base only multiplies by a constant factor, and we drop constant factors.
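That constant factor comes from the change-of-base identity, log_a(n) = log_b(n) / log_b(a). A quick numeric check (the values here are just an example):

```python
import math

n = 1000
print(math.log2(n))                     # log base 2 of n
print(math.log10(n) / math.log10(2))    # same value via change of base
```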
What is the term for how fast a function grows with the input size?
rate of growth
What do we call it when a function calls itself, until it doesn't?
recursion
What is the technique called for solving a smaller instance of the problem, until the problem is so small that we can solve it directly?
recursion
What do we call the second case, where we have to compute the same function but on a different value?
recursive case
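A minimal recursive factorial sketch tying these cards together (my own naming): the base case returns an answer immediately, and the recursive case calls the same function on a smaller value.

```python
def factorial(n):
    if n == 0:                         # base case: we immediately know the answer
        return 1
    return n * factorial(n - 1)        # recursive case: same function, smaller value

print(factorial(5))   # 120
```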
What do we do in insertion sort if the element is less than or equal to the key?
stop sliding and copy the key into the vacated position to the right of this element.
Why is a while loop better for binary search?
the indices guessed by binary search don't go in the sequential order that would make a for loop convenient
What is a logarithmic function trying to find?
the value of the exponent
When we walk through insertion sort, right to left, what are we doing?
we're sliding each element that is greater than the key (the current element) one position to the right.
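A minimal insertion sort sketch (Python, my own naming): the inner loop walks right to left, sliding elements greater than the key one position to the right, and stops as soon as it finds an element less than or equal to the key, copying the key into the vacated slot.

```python
def insertion_sort(a):
    """Sort list a in place by inserting each element into the sorted subarray to its left."""
    for i in range(1, len(a)):
        key = a[i]                       # the current element to insert
        j = i - 1
        while j >= 0 and a[j] > key:     # slide greater elements one position to the right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                   # stop sliding; copy key into the vacated position

values = [22, 11, 99, 88, 9, 7, 42]
insertion_sort(values)
print(values)  # [7, 9, 11, 22, 42, 88, 99]
```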