Recurrence Equations (Recurrence Relations)
A recurrence equation expresses the overall cost or value of a function as a combination of:
- One or more smaller subproblems (usually recursive calls).
- Some additional work done at that level (the non-recursive cost). Solving the recurrence is how we calculate the time complexity of a recursive algorithm.
Recurrence equations help us:
- Analyze runtime of recursive algorithms.
- Understand performance bottlenecks.
- Compare different algorithmic approaches.
Examples
Algorithm | Recurrence Equation | Time Complexity |
---|---|---|
Binary Search | T(n) = T(n/2) + c | O(log n) |
Merge Sort | T(n) = 2T(n/2) + cn | O(n log n) |
Fibonacci (naive) | T(n) = T(n-1) + T(n-2) + c | O(2^n) |
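As a minimal sketch of how a recurrence is read off actual code (function and variable names here are illustrative), a recursive binary search makes one recursive call on half the range plus constant extra work per call, giving T(n) = T(n/2) + c:

```python
# Illustrative recursive binary search on a sorted list.
# One recursive call on half the range + constant work per call:
# T(n) = T(n/2) + c, which solves to O(log n).
def binary_search(arr, target, lo=0, hi=None):
    if hi is None:
        hi = len(arr)              # search the half-open range [lo, hi)
    if lo >= hi:
        return -1                  # base case: empty range, constant work
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid                 # constant extra work at this level
    elif arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)   # T(n/2)
    else:
        return binary_search(arr, target, lo, mid)       # T(n/2)

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # -> 3
```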
Solving Recurrence Equations
Iterative method
Keep expanding the recurrence step by step until you reach the base case. Then, observe a pattern and simplify.
Example: T(n) = T(n-1) + c
This could come from something like the FACT(n) function (sketched after the expansion below).
T(n) = T(n-1) + c
= (T(n-2) + c) + c
= T(n-2) + 2c
= T(n-3) + 3c
...
= T(1) + (n - 1)c
If T(1) = c, then T(n) = T(1) + (n - 1)c = nc, so T(n) = Θ(n).
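A sketch of the FACT(n) function the notes refer to (assuming the usual recursive factorial): each call does constant work plus one recursive call on n - 1, which is exactly T(n) = T(n-1) + c.

```python
# Sketch of a recursive factorial matching T(n) = T(n-1) + c.
def fact(n):
    if n <= 1:
        return 1            # base case: T(1) = c
    return n * fact(n - 1)  # one multiplication (c) + recursive call T(n-1)

print(fact(5))  # -> 120
```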
Recursion tree method
Draw a recursion tree (annotating each node with its running time), then sum the costs across all levels.
- This method can be tricky to apply when the recursion tree is degenerate (unbalanced), since the per-level costs are no longer equal.
Example: T(n) = 2T(n/2) + cn (the Merge Sort recurrence):
Level 0: cn
Level 1: cn/2 + cn/2 = cn
Level 2: cn/4 + cn/4 + cn/4 + cn/4 = cn
...
There are log n levels, each contributing cn, so the total is cn log n = Θ(n log n).
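To make the level-by-level sum concrete, here is a small illustrative script (not part of the original notes) that expands the recursion tree for T(n) = 2T(n/2) + cn and adds up the cost per level; every level sums to about cn, and there are roughly log2(n) + 1 levels.

```python
import math

# Illustrative check of the recursion tree for T(n) = 2T(n/2) + cn.
# At level i there are 2**i subproblems of size n / 2**i, each costing
# c * (n / 2**i), so every level costs about c*n.
def level_costs(n, c=1.0):
    costs = []
    size, count = n, 1
    while size >= 1:
        costs.append(count * c * size)  # total cost of this level
        size /= 2
        count *= 2
    return costs

n = 1024
costs = level_costs(n)
print(len(costs), "levels, each costing ~", costs[0])
print("total =", sum(costs), "vs n*log2(n) + n =", n * math.log2(n) + n)
```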
Master theorem
If your recurrence fits the divide-and-conquer form T(n) = aT(n/b) + f(n), you can just plug in values.
- a: number of subproblems
- b: size shrink factor
- f(n): extra work done outside the recursion
Master Theorem Cases:
Let f(n) = Θ(n^c) and compare c with log_b(a):
Case | Condition | Result |
---|---|---|
1 | c < log_b(a) | T(n) = Θ(n^(log_b a)) |
2 | c = log_b(a) | T(n) = Θ(n^c log n) |
3 | c > log_b(a) | T(n) = Θ(n^c), provided the regularity condition holds |
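A small helper, as a sketch only for the simple f(n) = Θ(n^c) form used above (it does not check Case 3's regularity condition), that picks the Master Theorem case from a, b, and c:

```python
import math

# Illustrative Master Theorem case selection for T(n) = a*T(n/b) + Θ(n^c).
# (Case 3 also requires a regularity condition, which this sketch does not check.)
def master_theorem(a, b, c):
    crit = math.log(a, b)              # critical exponent log_b(a)
    if math.isclose(c, crit):
        return f"Case 2: T(n) = Θ(n^{c} log n)"
    elif c < crit:
        return f"Case 1: T(n) = Θ(n^{crit:.3g})"
    else:
        return f"Case 3: T(n) = Θ(n^{c})"

print(master_theorem(2, 2, 1))   # Merge Sort -> Case 2: Θ(n log n)
print(master_theorem(1, 2, 0))   # Binary Search -> Case 2: Θ(log n)
print(master_theorem(8, 2, 2))   # divide-and-conquer matrix multiply -> Case 1: Θ(n^3)
```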
Example: Merge Sort
- a = 2, b = 2, f(n) = n, so c = 1
- log_2(2) = 1 = c, so Case 2 applies → T(n) = Θ(n log n)
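For completeness, a minimal Merge Sort sketch (illustrative implementation): the two recursive calls on halves plus the linear-time merge are exactly the a = 2, b = 2, f(n) = cn pieces of the recurrence.

```python
# Minimal Merge Sort sketch matching T(n) = 2T(n/2) + cn.
def merge_sort(arr):
    if len(arr) <= 1:
        return arr                     # base case
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])       # T(n/2)
    right = merge_sort(arr[mid:])      # T(n/2)
    # merge step: linear extra work, the f(n) = cn term
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # -> [1, 2, 5, 7, 9]
```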