Big O Notation (Basics)
The execution time of O(1) algorithms (constant time complexity) is ___________________ of the size of the input.
independent
Name three examples of O(1) time algorithms.
1. accessing a value with an array index 2. push() 3. pop()
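A minimal sketch of these constant-time operations in JavaScript (an assumed example, not part of the original card):

var arr = [10, 20, 30];
var value = arr[1];   // index access: jumps straight to the slot, O(1)
arr.push(40);         // push(): appends to the end, O(1) (amortized)
arr.pop();            // pop(): removes from the end, O(1)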
__________ search is an example with complexity O(log n).
Binary
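A minimal iterative binary search sketch in JavaScript (the function name and details are assumed for illustration; it requires a sorted array). Each pass halves the remaining search range, which is where the O(log n) comes from:

function binarySearch(sorted, target) {
  var lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {
    var mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found it
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1;                                // not present
}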
In practice, what does knowing Big O Notation help with for a software engineer?
Enables them to determine how efficient different approaches to solving a problem are
The third-fastest possibility in Big O is O(_____), which is _____________ time complexity.
O(n) linear
Three examples of quadratic time complexity include _________ sort, __________ sort, and __________ sort.
bubble, selection, insertion
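As one illustration of the quadratic pattern behind these sorts, here is a minimal bubble sort sketch in JavaScript (an assumed example implementation): the two nested loops over the array are what make it O(n^2).

function bubbleSort(arr) {
  for (var i = 0; i < arr.length - 1; i++) {        // outer pass over the array
    for (var j = 0; j < arr.length - 1 - i; j++) {  // inner pass compares neighbors
      if (arr[j] > arr[j + 1]) {                    // swap an out-of-order pair
        var tmp = arr[j];
        arr[j] = arr[j + 1];
        arr[j + 1] = tmp;
      }
    }
  }
  return arr;
}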
Name a practical example of O(n) time algorithms.
A for loop over the input! (Finding a certain value in an array, printing all values in an array, etc.)
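A minimal linear-scan sketch in JavaScript (an assumed example): a single for loop touches each of the n elements at most once, so the work grows in direct proportion to n.

function findValue(arr, target) {
  for (var i = 0; i < arr.length; i++) {  // one pass over all n elements
    if (arr[i] === target) return i;      // found it: return the index
  }
  return -1;                              // not in the array
}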
The time it will take to run O(n) algorithms (linear time complexity) will increase proportionately as the size of input n __________________.
increases
O(log n) basically means time goes up ____________ while n goes up _________________.
linearly; exponentially. Ex: for (var i = 1; i < n; i = i * 2) { console.log(i); } (i doubles on every pass, so the loop body runs only about log2(n) times.)
O(log n) (logarithmic running time) essentially means that the running time grows in proportion to the _____________ of the input size.
logarithm
What is the number one example to think of when talking about O(n^2) algorithms (quadratic time complexity)?
nested for loops
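A minimal sketch of the nested-loop pattern in JavaScript (an assumed example): for each of the n outer iterations the inner loop runs n more times, so the body executes n * n = n^2 times.

function printAllPairs(arr) {
  for (var i = 0; i < arr.length; i++) {
    for (var j = 0; j < arr.length; j++) {  // inner loop runs n times per outer iteration
      console.log(arr[i], arr[j]);
    }
  }
}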
O(1) algorithms (constant time complexity) will always take the _____ _________ of time to be executed
same amount
With O(n^2) algorithms (quadratic time complexity), the execution time is proportional to the _____________ of the input size.
square
The fastest possibility in Big O is O(_____), which is _____________ time complexity.
O(1) constant
Big O Notation is a way to represent what?
How quickly an algorithm's running time grows as the size of its input grows
There are seven different possibilities when it comes to Big O Notation -- what are they, in order from fastest to slowest?
O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), O(n!)
If 10 items take at most some amount of time x, 100 items take at most 2x, and 10,000 items take at most 4x, what time complexity is this starting to look like?
O(log n) (The time still increases with the size of the input, but far more slowly: each time the input is squared -- 10 to 100 to 10,000 -- the time only doubles, from x to 2x to 4x, which is how a logarithm grows.)
The second-fastest possibility in Big O is O(_____), which is _____________ time complexity.
O(log n) logarithmic
Checking to see if there are any duplicates in a deck of cards by looping *once* and then looping *within* each card of that loop to check for the duplicate would be what time complexity?
O(n^2)
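A minimal sketch of that duplicate check in JavaScript (an assumed example): the outer loop picks each card and the inner loop compares it against every later card, so the number of comparisons grows with the square of the deck size.

function hasDuplicates(deck) {
  for (var i = 0; i < deck.length; i++) {
    for (var j = i + 1; j < deck.length; j++) {  // compare card i with every later card
      if (deck[i] === deck[j]) return true;      // found a duplicate
    }
  }
  return false;                                  // no duplicates anywhere
}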
True or False: With polynomial time complexities like O(n^2), the number of nested for loops over the input is the degree (exponent) to which the time grows.
True. Ex: Four nested loops over the input create O(n^4) time complexity.
O(n^2) is sometimes viewed as the worst/slowest possibility, but it's not. How many other slower possibilities are there? What are they? What is O(n^2) known as?
Two: O(2^n) and O(n!) are both slower. O(n^2) is known as *quadratic* time complexity.
A non-programming way to think about O(log n) time complexity is looking up people in a ____________ ____________.
phone book (You don't need to check *every* person in the phone book to find the right one; instead, you can divide and conquer based on where the name falls alphabetically, narrowing to a smaller and smaller section until you reach the right entry. A bigger phone book still takes longer to search, but the lookup time grows much more slowly than the size of the book.)
With O(n) algorithms (linear time complexity), the time to execute is directly ________________ to the input size n.
proportional