Solving Recurrence Relations

Mastering the techniques behind analyzing divide-and-conquer algorithms — from substitution to the Master Theorem

Many of the most powerful algorithms in computer science — especially divide-and-conquer algorithms — are naturally described using recurrence relations.

If you understand how to solve recurrences, you can determine the time complexity of algorithms like:

  • Merge Sort
  • Quick Sort
  • Binary Search
  • Strassen's Matrix Multiplication
  • Many Dynamic Programming algorithms

This article explains what recurrence relations are and how to solve them using different techniques.

🔍 What Is a Recurrence Relation?

A recurrence relation is an equation that defines a function in terms of itself. In algorithm analysis, we use recurrences to describe running time.

Example:

T(n) = T(n − 1) + 1

This means:

  • Solving a problem of size n
  • Requires solving a smaller problem of size n − 1
  • Plus some extra work

Example: Recursive Sum

def sum_array(arr, n):
    # Base case: an empty prefix sums to 0
    if n == 0:
        return 0
    # Recurse on the first n-1 elements, then add the nth element
    return sum_array(arr, n - 1) + arr[n - 1]

Running time recurrence:

T(n) = T(n − 1) + c
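We can confirm the linear cost empirically by counting recursive calls (the counter argument below is added purely for illustration):

```python
def sum_array_counted(arr, n, counter):
    # Each call does constant work (c), then recurses on a problem one smaller
    counter[0] += 1
    if n == 0:
        return 0
    return sum_array_counted(arr, n - 1, counter) + arr[n - 1]

counter = [0]
result = sum_array_counted([1, 2, 3, 4], 4, counter)
# For n = 4 there are n + 1 = 5 calls (sizes 4, 3, 2, 1, 0), so T(n) = O(n)
```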

📈 Why Recurrence Relations Matter

Most recursive algorithms do one of these:

  • Reduce problem size by 1
  • Divide problem into equal parts
  • Divide problem into unequal parts

Each leads to a different recurrence. Understanding recurrences helps us:

  • Predict algorithm efficiency
  • Compare divide-and-conquer strategies
  • Optimize recursive designs

🛠️ Methods to Solve Recurrence Relations

There are three major techniques:

  1. Substitution Method
  2. Recursion Tree Method
  3. Master Theorem

Let's go through them one by one.

📝 Method 1: Substitution Method

This method involves:

  1. Guessing the solution
  2. Proving it using induction

Example 1

T(n) = T(n − 1) + 1

Expand:

T(n) = T(n − 2) + 2
T(n) = T(n − 3) + 3

Continuing:

T(n) = T(1) + (n − 1)

So:

T(n) = O(n)

Example 2

T(n) = 2T(n/2) + 1

Let's guess:

T(n) = O(n)

To verify by substitution, assume the strengthened hypothesis T(n) ≤ cn − 1. Then:

T(n) = 2T(n/2) + 1 ≤ 2(c · n/2 − 1) + 1 = cn − 1

The hypothesis holds, so T(n) = O(n). (Note that the naive guess T(n) ≤ cn fails — substituting yields cn + 1, which does not close the induction. Strengthening the hypothesis is a classic trick of the substitution method.)
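As a numeric sanity check, we can evaluate the recurrence directly (assuming the base case T(1) = 1, which gives exactly T(n) = 2n − 1 for powers of two):

```python
def T(n):
    # Evaluates T(n) = 2T(n/2) + 1 with an assumed base case T(1) = 1
    if n == 1:
        return 1
    return 2 * T(n // 2) + 1

# For powers of two the exact solution is T(n) = 2n - 1, which is linear
values = {2 ** k: T(2 ** k) for k in range(1, 7)}
```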

🌳 Method 2: Recursion Tree Method

This method visualizes recursion as a tree.

Example: Merge Sort

Recurrence:

T(n) = 2T(n/2) + n

Step 1: Expand levels.

Level | Subproblems | Cost per level
------|-------------|---------------
0     | 1 × n       | n
1     | 2 × n/2     | n
2     | 4 × n/4     | n
...   | ...         | n

Each level costs n.

Height of tree:

log n

Total cost:

n · log n

Final Answer:

T(n) = O(n log n)
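The merge sort that produces this recurrence can be sketched as follows — two recursive calls on halves (the 2T(n/2) term) plus a linear-time merge (the + n term):

```python
def merge_sort(arr):
    # Base case: lists of length 0 or 1 are already sorted
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    # Two subproblems of size n/2 — the 2T(n/2) part of the recurrence
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Linear-time merge — the + n part of the recurrence
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```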

⚡ Method 3: Master Theorem

The most powerful tool for divide-and-conquer recurrences. It applies to recurrences of the form:

T(n) = aT(n/b) + f(n)

Where:

  • a = number of subproblems
  • b = division factor
  • f(n) = extra work per level

Compare f(n) with n^(log_b a):

Case 1: f(n) is polynomially smaller

If:

f(n) = O(n^(log_b a − ε)) for some ε > 0

Then:

T(n) = O(n^(log_b a))

Case 2: f(n) is equal

If:

f(n) = Θ(n^(log_b a))

Then:

T(n) = O(n^(log_b a) · log n)

Case 3: f(n) is polynomially larger

If:

f(n) = Ω(n^(log_b a + ε)) for some ε > 0

Then:

T(n) = O(f(n))

(Subject to the regularity condition: a · f(n/b) ≤ c · f(n) for some constant c < 1)

🧪 Master Theorem Examples

Example 1: Merge Sort

T(n) = 2T(n/2) + n

Here: a = 2, b = 2, f(n) = n

Compute:

n^(log_2 2) = n^1 = n

Since f(n) = n, this is Case 2.

T(n) = O(n log n)

Example 2: Binary Search

T(n) = T(n/2) + 1

Here: a = 1, b = 2, f(n) = 1

Compute:

n^(log_2 1) = n^0 = 1

This is Case 2.

T(n) = O(log n)
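For reference, a recursive binary search that matches the recurrence T(n) = T(n/2) + 1 might look like this (a minimal sketch):

```python
def binary_search(arr, target, lo=0, hi=None):
    # One recursive call on half the range plus O(1) work: T(n) = T(n/2) + 1
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1  # not found
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)
```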

Example 3: Strassen's Algorithm

T(n) = 7T(n/2) + n²

Compute:

n^(log_2 7) ≈ n^2.81

Since n² < n^2.81, Case 1 applies.

T(n) = O(n^2.81)
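The three examples above can be mechanized. Here is a hypothetical helper that classifies recurrences of the form T(n) = a·T(n/b) + Θ(n^k) — a simplified sketch that handles only polynomial f(n):

```python
import math

def master_theorem(a, b, k):
    # Classifies T(n) = a·T(n/b) + Θ(n^k); polynomial f(n) only.
    # Comparing floats with == is a simplification acceptable for a sketch.
    crit = math.log(a, b)  # the critical exponent log_b(a)
    if k < crit:
        return f"Case 1: O(n^{crit:.2f})"
    if k == crit:
        return f"Case 2: O(n^{crit:.2f} log n)" if crit else "Case 2: O(log n)"
    return f"Case 3: O(n^{k})"

print(master_theorem(2, 2, 1))  # Merge Sort
print(master_theorem(1, 2, 0))  # Binary Search
print(master_theorem(7, 2, 2))  # Strassen
```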

🔄 Iteration Method (Another Useful Technique)

You repeatedly substitute until reaching the base case.

Example

T(n) = 3T(n/3) + n

Expand:

= 3[3T(n/9) + n/3] + n = 9T(n/9) + 2n

Continue — after k steps:

= 3^k · T(n/3^k) + k · n

Stop when:

n/3^k = 1 → k = log₃ n

Final:

T(n) = n · T(1) + n · log₃ n = O(n log n)
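A quick numeric check of this result (assuming the base case T(1) = 1):

```python
def T(n):
    # Evaluates T(n) = 3T(n/3) + n with an assumed base case T(1) = 1
    if n == 1:
        return 1
    return 3 * T(n // 3) + n

# For n = 3^k the closed form is T(n) = n · T(1) + n · log₃(n)
n = 3 ** 5  # 243, so log₃(n) = 5
value = T(n)
```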

📊 Common Recurrence Patterns

Recurrence            | Time Complexity
----------------------|----------------
T(n) = T(n−1) + 1     | O(n)
T(n) = T(n−1) + n     | O(n²)
T(n) = 2T(n/2) + n    | O(n log n)
T(n) = 2T(n/2) + n²   | O(n²)
T(n) = T(n/2) + 1     | O(log n)

💡 Pro Tip

Memorize these common patterns — they appear frequently in interviews and competitive programming.

⚠️ When Master Theorem Does NOT Apply

Master Theorem only works when:

T(n) = aT(n/b) + f(n)

It does NOT apply if:

  • Subproblem sizes are unequal (e.g. T(n) = T(n/3) + T(2n/3) + n)
  • The problem size is not reduced by a constant factor b
  • The recurrence has a different shape entirely (e.g. subtractive or multi-term recurrences)

Example:

T(n) = T(n − 1) + T(n − 2)

⚠️ Important

This Fibonacci-style recurrence requires the characteristic equation method, not the Master Theorem.
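As a quick illustration of why: the characteristic equation x² = x + 1 has dominant root φ = (1 + √5)/2 ≈ 1.618, so solutions grow as Θ(φⁿ) — exponential, not polynomial. A small numeric sketch (base cases T(0) = T(1) = 1 are assumed):

```python
def fib_like(n):
    # Iteratively evaluates T(n) = T(n-1) + T(n-2), assumed bases T(0) = T(1) = 1
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return a

phi = (1 + 5 ** 0.5) / 2  # dominant root of x^2 = x + 1
ratio = fib_like(30) / fib_like(29)
# successive terms grow by a factor approaching phi, so T(n) = Θ(phi^n)
```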

🌐 Real-World Importance

Recurrence solving is used in:

  • Algorithm analysis — determining time complexity of recursive solutions
  • Compiler optimization — understanding recursive code transformations
  • Parallel computing — analyzing divide-and-conquer parallelism
  • Machine learning algorithm analysis — recursive model training
  • Systems performance modeling — predicting scalability

💡 Key Insight

Without solving recurrences, we cannot understand the scalability of recursive algorithms.

📝 Final Summary

Recurrence relations describe recursive algorithm performance. We solve them using:

  1. Substitution Method — guess and prove by induction
  2. Recursion Tree Method — visualize as a tree and sum levels
  3. Master Theorem — direct formula for divide-and-conquer
  4. Iteration Method — expand until base case

The key skill is recognizing the pattern and choosing the correct method.

Mastering recurrence relations is essential for:

  • Competitive programming
  • Technical interviews
  • Algorithm research
  • Advanced computer science