Master Theorem
The Master Theorem, also known as the Master Method, provides the asymptotic analysis (i.e., the time complexity) of many recursive algorithms that follow the divide-and-conquer pattern.
```
function dac(n):
    if n < k:  // k: some constant
        solve "n" directly without recursion
    else:
        [1] divide the problem "n" into "b" subproblems of equal size,
            so each subproblem has size "n/b"
        [2] call the function "dac()" recursively "a" times on the subproblems
        [3] combine the results from the subproblems
```
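As a concrete instance of this template (an illustration added here, not part of the template itself), below is a minimal merge sort sketch in Python: it divides the input into b = 2 halves, makes a = 2 recursive calls, and combines the halves in linear time.

```python
def merge_sort(arr):
    # Base case: below a constant size, solve directly without recursion.
    if len(arr) <= 1:
        return arr

    # [1] Divide the problem into b = 2 subproblems of size n/2.
    mid = len(arr) // 2

    # [2] Call the function recursively a = 2 times on the subproblems.
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # [3] Combine the results from the subproblems in O(n) time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

For this instance a = 2, b = 2, and the combine step costs O(n), which yields exactly the recurrence analyzed next.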
If we define the time complexity of the above recursive algorithm as T(n), then we can express it as follows:
$$ T(n) = a \cdot T\left(\frac{n}{b}\right) + f(n) $$
- f(n) is the time it takes to divide the problem into subproblems and to combine the results from the subproblems
- f(n) can be represented as $O(n^d)$ with $d \geq 0$
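To get a feel for the recurrence before stating the theorem, the following sketch (illustrative only; the helper name `make_T` is my own) unrolls $T(n) = a \cdot T(n/b) + n^d$ numerically:

```python
import math
from functools import lru_cache

def make_T(a, b, d):
    """Numerically unroll T(n) = a * T(n // b) + n**d with T(1) = 1."""
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1.0
        return a * T(n // b) + float(n) ** d
    return T

# Merge-sort-like parameters: a = 2, b = 2, d = 1 -> expect Theta(n log n).
T = make_T(2, 2, 1)
for n in [2**10, 2**14, 2**18]:
    # If T(n) is Theta(n log n), this ratio should stay roughly constant.
    print(n, T(n), T(n) / (n * math.log2(n)))
```

With these parameters the printed ratio stays close to 1 as n grows, consistent with the $\Theta(n \log n)$ bound the theorem derives below.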
The Master Theorem then provides three formulas to calculate the time complexity of the recursive algorithm, according to how f(n) compares with $n^{\log_b a}$ (equivalently, how d compares with $\log_b a$). They are stated as follows:
$$ \text{Case 1: If } f(n) = O(n^{\log_b a - \epsilon}) \text{ for some } \epsilon > 0, \text{ then } T(n) = \Theta(n^{\log_b a}). $$

$$ \text{Case 2: If } f(n) = \Theta(n^{\log_b a}), \text{ then } T(n) = \Theta(n^{\log_b a} \log n). $$

$$ \text{Case 3: If } f(n) = \Omega(n^{\log_b a + \epsilon}) \text{ for some } \epsilon > 0, \text{ and } af\left(\frac{n}{b}\right) \leq kf(n) \text{ for some constant } k < 1 \text{ and sufficiently large } n, \text{ then } T(n) = \Theta(f(n)). $$
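As standard illustrations of the three cases (classic textbook examples, not specific to this wiki):

- Case 1 (Karatsuba multiplication): $T(n) = 3T(n/2) + O(n)$. Here $n^{\log_2 3} \approx n^{1.585}$ grows faster than $f(n) = O(n)$, so $T(n) = \Theta(n^{\log_2 3})$.
- Case 2 (merge sort): $T(n) = 2T(n/2) + O(n)$. Here $f(n) = \Theta(n^{\log_2 2}) = \Theta(n)$, so $T(n) = \Theta(n \log n)$.
- Case 3 (e.g., $T(n) = 2T(n/2) + n^2$): $f(n) = n^2 = \Omega(n^{1 + \epsilon})$, and the regularity condition holds since $2(n/2)^2 = \frac{1}{2}n^2$ gives $k = \frac{1}{2} < 1$, so $T(n) = \Theta(n^2)$.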
- The conditions for each case capture the intuition of whether the work to split the problem and combine the results dominates the work done inside the subproblems. For example, in Case 1 the split/combine work $f(n)$ is dwarfed by the subproblems, so the time complexity of the overall recursive algorithm reduces to the cost of the recursion itself (i.e., the $a \cdot T(n/b)$ term, giving $\Theta(n^{\log_b a})$). In Case 2 the two contributions are balanced at every level of the recursion, which contributes the extra $\log n$ factor, and in Case 3 the split/combine work dominates, so $T(n) = \Theta(f(n))$.
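To make the case analysis concrete, here is a minimal sketch that encodes the simplified polynomial form (assuming $f(n) = \Theta(n^d)$, where the comparison reduces to d versus $\log_b a$ and the Case 3 regularity condition holds automatically; the function name `master_theorem` is my own):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a * T(n/b) + Theta(n**d) by comparing d with the
    critical exponent log_b(a). Valid only for polynomial f(n)."""
    crit = math.log(a, b)
    if d < crit:
        return f"Case 1: Theta(n^{crit:.3f})"
    if d == crit:  # exact float comparison is fine for these small examples
        return f"Case 2: Theta(n^{d} log n)"
    return f"Case 3: Theta(n^{d})"

print(master_theorem(2, 2, 1))  # merge sort         -> Case 2
print(master_theorem(3, 2, 1))  # Karatsuba          -> Case 1
print(master_theorem(2, 2, 2))  # heavy combine step -> Case 3
```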