###### Books / Analysis of Algorithms / Chapter 6

# Master Theorem

Since divide and conquer algorithms occur quite frequently and produce recursive equations for their run times, it is important to be able to solve such equations. Last lecture we “solved” the recurrence equation for merge sort using a method known as a *recursion tree*. While that method can in theory always be used, it is often cumbersome. For problems that divide into a fixed number of identically sized recursive pieces, there is a much simpler method known as the *master theorem* that solves certain recursive equations almost “by inspection”.

## Master Theorem

The master theorem can be employed to solve recursive equations of the form

T(n) = aT(n/b) + f(n)

where *a* ≥ 1, *b* > 1, and *f(n)* is *asymptotically positive*. Intuitively for divide and conquer algorithms, this equation represents dividing the problem up into *a* subproblems of size *n/b* with a combine time of *f(n)*. For example, for merge sort *a* = 2, *b* = 2, and *f(n)* = Θ(*n*). Note that floors and ceilings for *n/b* do not affect the asymptotic behavior or the results derived using the theorem.
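
The claim about floors and ceilings is easy to check numerically. Below is a small sketch (Python; the base case T(1) = 1 is an assumption, not part of the recurrence) that expands the exact merge sort recurrence T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + n; the ratio T(n)/(n lg n) hovers near a constant even for n that are not powers of two:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # exact merge sort recurrence with explicit floor and ceiling:
    # T(n) = T(floor(n/2)) + T(ceil(n/2)) + n, base case T(1) = 1 (assumed)
    if n <= 1:
        return 1
    return T(n // 2) + T((n + 1) // 2) + n

# the ratio T(n) / (n lg n) stays near a constant, so the floors and
# ceilings do not change the Theta(n lg n) behavior
for n in (1000, 10**5, 10**6):
    print(n, T(n) / (n * math.log2(n)))
```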

If the recursion is in the form shown above, then the recurrence can be solved depending on one of three cases (which relate the dominance of the recursive term to the combine term):

### Case 1 (recursion dominates)

If

f(n) = O(n^{log_b a - ε})

for some **constant** ε > 0, then the solution to the recurrence is given by

T(n) = Θ(n^{log_b a})

In this case, *f(n)* is *polynomially bounded* above by *n^{log_b a}* (which represents the run time of the recursive term), i.e. the recursive term dominates the run time.

### Case 2 (recursion and combine are “equal”)

If

f(n) = Θ(n^{log_b a})

then the solution to the recurrence is given by

T(n) = Θ(n^{log_b a} lg n)

In this case, *f(n)* is “equal” to *n^{log_b a}*, i.e. neither term dominates, hence the extra lg n factor in the bound.

### Case 3 (combine dominates)

If

f(n) = Ω(n^{log_b a + ε})

for some **constant** ε > 0 **AND** if *a f(n/b)* ≤ *c f(n)* for some **constant** *c* < 1 and *sufficiently large n* (this additional constraint is known as the *regularity condition*), then the solution to the recurrence is given by

T(n) = Θ(f(n))

In this case, *f(n)* is *polynomially bounded* below by *n^{log_b a}* (which represents the run time of the recursive term) with the additional regularity condition, i.e. the combine term dominates the run time.

Thus in all three cases it is important to compute the *recursive term run time* n^{log_b a} and compare it *asymptotically* to the *combine term run time f(n)* to determine which case holds. If the recursive equation satisfies either case 1 or case 2, the solution can then be written by inspection. If the recursive equation satisfies case 3, then the regularity condition must be verified in order to write down the solution.

Note: There are gaps between the cases where the theorem cannot be applied (as well as recursive equations that **do not** fit the form required by the theorem **exactly**). In these cases, other techniques must be used which will not be covered in this course (refer to chapter 7 for details).

## Procedure

To apply the master theorem, the following procedure can be applied. Given the recursive equation

T(n) = aT(n/b) + f(n)

- Convert any asymptotic bounds for *f(n)* to functional notation, e.g. Θ(n^{2}) ⇒ cn^{2}
- Identify **a**, **b**, and **f(n)**
- Compute **n^{log_b a}**
- Compare **f(n)** with **n^{log_b a}** to determine the proper case and write the solution to the recursive equation:
  - **Case 1** (*f(n) “<” n^{log_b a}*): Give an ε to show **f(n) = O(n^{log_b a - ε})**, then give the solution **T(n) = Θ(n^{log_b a})**
  - **Case 2** (*f(n) “≈” n^{log_b a}*): Show **f(n) = Θ(n^{log_b a})**, then give the solution **T(n) = Θ(n^{log_b a} lg n)**
  - **Case 3** (*f(n) “>” n^{log_b a}*): Give an ε to show **f(n) = Ω(n^{log_b a + ε})**, **AND** show **a f(n/b) ≤ c f(n)** for some *c* < 1, then give the solution **T(n) = Θ(f(n))**
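
For combine terms that are plain polynomials f(n) = Θ(n^{k}), the comparison above reduces to comparing k with log_b a. Here is a minimal sketch of that special case (Python; the function name and the restriction to polynomial f are my own, not part of the theorem):

```python
import math

def master_case(a, b, k):
    """Return the master theorem case for T(n) = a*T(n/b) + Theta(n^k).

    Only handles purely polynomial combine terms. For f(n) = n^k with
    k > log_b a the regularity condition a*(n/b)^k <= c*n^k always holds
    with c = a / b**k < 1, so Case 3 needs no extra check here.
    """
    crit = math.log(a, b)        # the critical exponent log_b a
    if math.isclose(k, crit):    # f(n) "equal" to n^{log_b a}
        return 2                 # T(n) = Theta(n^{log_b a} lg n)
    if k < crit:                 # recursion dominates
        return 1                 # T(n) = Theta(n^{log_b a})
    return 3                     # combine dominates: T(n) = Theta(n^k)
```

For instance, `master_case(9, 3, 1)` gives Case 1 (Example 1 below) and `master_case(2, 2, 1)` gives Case 2 (merge sort).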

## Examples

### Example 1

Solve the recursive equation

T(n) = 9T(n/3) + n

1. **f(n) = n** does not contain any asymptotic bounds that need conversion.

2. For this equation

   a = 9

   b = 3

   f(n) = n

   Intuitively this equation would represent an algorithm that divides the original input into nine groups each consisting of a third of the elements and takes linear time to combine the results.

3. Computing

   n^{log_b a} = n^{log_3 9} = n^{2}

4. We see that f(n) = n “<” n^{2} so we try to show **Case 1**, i.e.

   f(n) = n = O(n^{log_b a - ε}) = O(n^{2 - ε})

   which is satisfied for any ε ≤ 1, e.g. choose ε = 1 giving n = O(n^{2 - 1}) = O(n). Hence the equation satisfies **Case 1** so we can give the solution as

   T(n) = Θ(n^{log_b a}) = Θ(n^{2})
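
A quick numeric sanity check (Python; the base case T(1) = 1 is assumed, and integer division stands in for n/3) shows T(n)/n^{2} approaching a constant:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 9*T(n/3) + n, base case T(1) = 1 (assumed)
    if n <= 1:
        return 1
    return 9 * T(n // 3) + n

# T(n) / n^2 approaches a constant (3/2 for this base case),
# consistent with T(n) = Theta(n^2)
for n in (3**4, 3**6, 3**8):
    print(n, T(n) / n**2)
```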

### Example 2

Solve the recursive equation

T(n) = T(2n/3) + Θ(1)

1. **f(n) = Θ(1)**, hence we convert the asymptotic bound to f(n) = c.

2. For this equation

   a = 1

   b = 3/2 (note *b* is the *denominator* of n/(3/2))

   f(n) = c

   Intuitively this equation would represent an algorithm that only uses two thirds of the elements at each step and takes constant time to combine the results.

3. Computing

   n^{log_b a} = n^{log_{3/2} 1} = n^{0} = 1

4. We see that f(n) = c “≈” 1 so we try to show **Case 2**, i.e.

   f(n) = c = Θ(n^{log_b a}) = Θ(1)

   which is satisfied. Hence the equation satisfies **Case 2** so we can give the solution as

   T(n) = Θ(n^{log_b a} lg n) = Θ(1 · lg n) = Θ(lg n)
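
A numeric sketch (Python; c = 1 and T(1) = 1 are assumptions, with integer arithmetic standing in for 2n/3) confirms the logarithmic growth:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(2n/3) + 1, base case T(1) = 1 (constant c taken as 1)
    if n <= 1:
        return 1
    return T(2 * n // 3) + 1

# T(n) grows like log n: the ratio T(n) / lg n stays in a constant band
for n in (10**2, 10**4, 10**6):
    print(n, T(n) / math.log2(n))
```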

### Example 3

Solve the recursive equation

T(n) = 3T(n/4) + 2nlgn

1. **f(n) = 2n lg n** does not contain any asymptotic bounds that need conversion.

2. For this equation

   a = 3

   b = 4

   f(n) = 2n lg n

   Intuitively this equation would represent an algorithm that divides the original input into three groups each consisting of a quarter of the elements and takes linearithmic (n lg n) time to combine the results.

3. Computing

   n^{log_b a} = n^{log_4 3} ≈ n^{0.79}

4. We see that f(n) = 2n lg n “>” n^{0.79} so we try to show **Case 3**, i.e.

   f(n) = 2n lg n = Ω(n^{log_b a + ε}) = Ω(n^{0.79 + ε})

   which (since 2n lg n = Ω(n)) is satisfied for any ε ≤ 0.21, e.g. choose ε = 0.2 giving 2n lg n = Ω(n^{0.79 + 0.2}) = Ω(n^{0.99}). Hence the equation might satisfy **Case 3** if we can show the regularity condition a f(n/b) ≤ c f(n) for some c < 1. Thus

   a f(n/b) = 3 · 2(n/4) lg(n/4) = (3/4) · 2n lg(n/4) ≤ (3/4) · 2n lg n = (3/4) f(n)

   so regularity holds by choosing c = 3/4 (which is < 1). Hence we can give the solution as

   T(n) = Θ(f(n)) = Θ(2n lg n) = Θ(n lg n)
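
Numerically (Python; T(1) = 1 assumed, integer division for n/4) the ratio T(n)/(n lg n) stays within a constant band, consistent with Θ(n lg n):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3*T(n/4) + 2*n*lg(n), base case T(1) = 1 (assumed)
    if n <= 1:
        return 1
    return 3 * T(n // 4) + 2 * n * math.log2(n)

# the combine term dominates: T(n) / (n lg n) stays within a constant band
for n in (4**5, 4**8, 4**10):
    print(n, T(n) / (n * math.log2(n)))
```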

### Example 4

Solve the recursive equation

T(n) = 2T(n/2) + nlgn

1. **f(n) = n lg n** does not contain any asymptotic bounds that need conversion.

2. For this equation

   a = 2

   b = 2

   f(n) = n lg n

   Intuitively this equation would represent an algorithm that divides the original input into two groups each consisting of half of the elements and takes linearithmic (n lg n) time to combine the results.

3. Computing

   n^{log_b a} = n^{log_2 2} = n^{1} = n

4. We see that f(n) = n lg n “>” n so we try to show **Case 3**, i.e.

   f(n) = n lg n = Ω(n^{log_b a + ε}) = Ω(n^{1 + ε})

   However, while n lg n ≥ n asymptotically, n lg n is not *polynomially greater* than n: lg n grows more slowly than n^{ε} for every ε > 0, so no ε satisfies the above equation and **Case 3** does not apply (and clearly n lg n ≠ Θ(n), so we are not in **Case 2** either).

   However, we can use a *generalization* of **Case 2** which states that if

   f(n) = Θ(n^{log_b a} lg^{k} n) for some k ≥ 0

   (note the standard **Case 2** is the special case k = 0), then the solution to the recurrence is given by

   T(n) = Θ(n^{log_b a} lg^{k+1} n)

   Hence in this example

   f(n) = n lg n = Θ(n^{log_b a} lg^{k} n) = Θ(n lg n) for k = 1

   so we can give the solution as

   T(n) = Θ(n^{log_b a} lg^{k+1} n) = Θ(n lg^{2} n)
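
As a sanity check (Python; the base case T(1) = 0 is an assumption), the ratio T(n)/(n lg^{2} n) approaches a constant (1/2 for this base case):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + n*lg(n), base case T(1) = 0 (assumed)
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n * math.log2(n)

# T(n) / (n lg^2 n) tends to 1/2, matching T(n) = Theta(n lg^2 n)
for n in (2**6, 2**12, 2**18):
    print(n, T(n) / (n * math.log2(n) ** 2))
```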