Time & Space Complexity - marzoukali/cs-notes GitHub Wiki
Time complexity notes:
- In computer science we usually assume log base 2, while in math the default base is 10 (or e). The base only changes the result by a constant factor, so it doesn't affect Big-O.
- log₂ 16 = 4 means it takes 4 halving steps to get from 16 down to 1 (equivalently, 4 doublings to get from 1 to 16).
- When we halve the problem at each step, as in divide-and-conquer algorithms, the number of steps grows as log n.
- O(n log n) means we do O(log n) work for each of the n elements, as in merge sort (all cases) and quicksort (average case; its worst case is O(n²)).
- Recursive and backtracking problems are often exponential, e.g. O(2^n) when each call branches in two, but not always: the complexity depends on the branching factor and the recursion depth.
- When counting things like all permutations of n items, we get O(n!).
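The halving idea above can be sketched directly (a minimal example; the function name is mine, not from these notes):

```js
// Count how many times n can be halved before reaching 1.
// For n = 16 this is 4, matching log2(16) = 4.
function halvingSteps(n) {
  let steps = 0;
  while (n > 1) {
    n = Math.floor(n / 2); // halve the problem, as in divide and conquer
    steps++;
  }
  return steps;
}

console.log(halvingSteps(16)); // 4
```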
Space complexity notes:
- Recursion takes space proportional to the maximum depth of the call stack: O(n) for a linear recursion, but only O(log n) for something like recursive binary search.
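A sketch of why the call-stack depth drives space usage (function name is mine): a linear recursion keeps one stack frame per level, so summing 1..n this way uses O(n) stack space even though it returns a single number:

```js
// Sums 1 + 2 + ... + n recursively.
// At the deepest point, n stack frames are alive at once → O(n) space.
function sumTo(n) {
  if (n === 0) return 0;
  return n + sumTo(n - 1);
}

console.log(sumTo(10)); // 55
```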
Improvements:
- Time and space are often trade-offs: spending more space (e.g. on a cache) can decrease the time, and vice versa.
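A common example of this trade-off (a sketch, not from the original notes): memoizing Fibonacci spends O(n) extra space on a cache to cut the naive O(2^n) time down to O(n):

```js
// Naive recursive Fibonacci is O(2^n) time.
// Caching previously computed values (memoization) costs O(n) space
// but brings the time down to O(n).
function fibMemo(n, cache = new Map()) {
  if (n <= 1) return n;
  if (cache.has(n)) return cache.get(n); // reuse earlier work
  const result = fibMemo(n - 1, cache) + fibMemo(n - 2, cache);
  cache.set(n, result);
  return result;
}

console.log(fibMemo(50)); // fast; the naive version would take ages
```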
Examples:
- Constant time — printing the first 10 items of an array is O(1), because the loop runs at most 10 times no matter how large the array is:

```js
function firstTen(arr) {
  let i = 0;
  while (i < 10 && i < arr.length) {
    console.log(arr[i]);
    i++;
  }
}
```
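For contrast, here is a sketch (function name is mine) where the work does grow with the input: visiting every element once is O(n), because the loop body runs exactly arr.length times:

```js
// O(n): the loop body runs once per element, so the total work
// grows linearly with the length of the array.
function sumAll(arr) {
  let total = 0;
  for (const value of arr) {
    total += value;
  }
  return total;
}

console.log(sumAll([1, 2, 3, 4])); // 10
```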
- For O(log n) and O(n log n) times, we should analyze carefully and look for patterns, as in binary search, merge sort, heap sort, and quicksort (in its average case, not its worst).
- The idea is that an algorithm is O(log n) if, instead of scanning a structure one element at a time, you divide it in half over and over again and do a constant number of operations per split. Search algorithms where the answer space keeps getting split are O(log n). An example of this is binary search, where you keep splitting an ordered array in half until you find the number. Note: you don't necessarily need to split into even halves.
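As a sketch of that idea (a standard iterative binary search, not taken from these notes): each pass discards half of the remaining range, so at most about log₂(n) iterations run.

```js
// Returns the index of target in a sorted array, or -1 if absent.
// Each iteration halves [lo, hi], giving O(log n) time.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1; // not found
}

console.log(binarySearch([1, 3, 5, 7, 9], 7)); // 3
```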
- Example of O(n) vs O(log n):
```php
// O(n): $i increases by 1 each pass, so the body runs about n times.
function fxn($n) {
    for ($i = 0; $i <= $n; $i++)
        echo $i;
}
```

```php
// O(log n): $i doubles each pass, so the body runs about log2(n) times.
function fxn($n) {
    for ($i = 1; $i <= $n; $i *= 2)
        echo $i;
}
```
- Cheatsheet: