Big O Notation: An Interactive Guide

Building intuition to recognize time complexity from code patterns.

The Visualizer

The visualizer consists of three components:

  1. Code Editor: runs JavaScript across a range of input sizes.
  2. Graph Chart: displays operation counts as n increases.
  3. Control Panel: a slider that controls the range of inputs.

[Figure: Big O Visualizer Components]

Use op() markers to count operations.

As n increases, the visualizer tracks how many times each marker is called.


Additional features:

  • Pass values to specify growth rates, e.g., op(n) for linear or op(n*Math.log2(n)) for linearithmic.
  • Use plot() to overlay theoretical curves, e.g., plot(n*n) for quadratic comparison.
  • Add op.scope() before inner loops to isolate their contribution.
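The marker mechanism can be sketched as a plain counter. Note this is a hypothetical stand-in for the visualizer's own `op()`, assuming it simply increments a tally (optionally weighted) each time it is called:

```javascript
// Hypothetical stand-in for the visualizer's op() marker:
// a counter that increments by a given weight on each call.
let count = 0;
function op(weight = 1) { count += weight; }

// Instrumenting a linear loop: op() fires once per iteration.
function sumArray(arr) {
  let total = 0;
  for (const x of arr) {
    op();       // one operation per element → O(n)
    total += x;
  }
  return total;
}

sumArray([1, 2, 3, 4, 5]);
console.log(count); // one count per element
```

As n grows, plotting `count` against n traces out the function's growth curve.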

Common Complexities

O(1) - Constant

Operations that execute in fixed time regardless of input size. Example: Array element access.

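A minimal sketch of constant-time access (the function name is illustrative):

```javascript
// Array indexing takes the same time whether the array
// holds ten elements or ten million.
function getFirst(arr) {
  return arr[0]; // single operation, independent of arr.length → O(1)
}

console.log(getFirst([7, 8, 9])); // 7
```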

O(log n) - Logarithmic

Occurs when the problem size is halved at each step. Logarithmic growth is remarkably slow—log₂(1,000,000) is approximately 20. Example: Binary search.

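Binary search illustrates the halving pattern; a sketch:

```javascript
// Binary search over a sorted array: the search window halves each
// iteration, so at most ~log₂(n) comparisons are made.
function binarySearch(sorted, target) {
  let lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;             // midpoint of the current window
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1; // not found
}
```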

O(n) - Linear

One complete pass through the input. Often represents the lower bound for problems requiring examination of all data. Example: Finding the maximum value in an array.

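Finding the maximum is a canonical single pass; a sketch:

```javascript
// Every element must be inspected exactly once, hence O(n):
// skipping any element could miss the maximum.
function findMax(arr) {
  let max = -Infinity;
  for (const x of arr) {
    if (x > max) max = x; // one comparison per element
  }
  return max;
}
```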

O(n log n) - Linearithmic

The characteristic complexity of efficient divide-and-conquer algorithms. Marginally worse than linear, but substantially better than quadratic. Example: Merge sort, quicksort.

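A compact sketch of quicksort, one of the divide-and-conquer sorts named above. On average the pivot splits the array roughly in half, giving about log₂(n) levels of recursion with O(n) partitioning work per level:

```javascript
// Functional quicksort sketch: O(n log n) on average.
// (Worst case is O(n²) when the pivot is always an extreme value.)
function quicksort(arr) {
  if (arr.length <= 1) return arr;            // base case: already sorted
  const [pivot, ...rest] = arr;
  const left = rest.filter(x => x < pivot);   // elements below the pivot
  const right = rest.filter(x => x >= pivot); // elements at or above it
  return [...quicksort(left), pivot, ...quicksort(right)];
}
```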

O(n²) - Quadratic

Results from nested iteration over the input. Manageable for small datasets, but performance degrades rapidly beyond a few thousand items. Example: Bubble sort, naive matrix multiplication.

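Bubble sort shows the nested-iteration pattern directly; a sketch:

```javascript
// Bubble sort: the nested loops perform roughly n²/2 comparisons.
function bubbleSort(arr) {
  const a = [...arr]; // copy so the input is left untouched
  for (let i = 0; i < a.length; i++) {
    for (let j = 0; j < a.length - 1 - i; j++) {
      if (a[j] > a[j + 1]) {
        [a[j], a[j + 1]] = [a[j + 1], a[j]]; // swap out-of-order neighbors
      }
    }
  }
  return a;
}
```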

O(2ⁿ) - Exponential

Each increment of input size doubles the total work. Visualize it as a binary tree that expands level by level.

The code mirrors this structure—each recursive call spawns two more calls. Example: Naive recursive Fibonacci, subset enumeration.

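Naive recursive Fibonacci makes the doubling visible in the code; a sketch:

```javascript
// Each call spawns two more, mirroring the binary tree described above;
// the total call count grows roughly as 2ⁿ.
function fib(n) {
  if (n <= 1) return n;           // leaves of the recursion tree
  return fib(n - 1) + fib(n - 2); // two recursive branches per call
}

console.log(fib(10)); // 55
```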

O(n!) - Factorial

Each increment of input size multiplies the total work by roughly n. The recursion tree's branching factor shrinks from n at the root to 1 at the leaves, leading to extremely rapid growth. Even for n ≈ 60, the total number of operations exceeds the estimated number of atoms in the observable universe.

The code reflects this branching pattern—each call spawns a number of recursive calls equal to the current n. Example: generating all permutations naively.

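A sketch of naive permutation generation, following the branching pattern described above:

```javascript
// Each call branches once per remaining element, so the recursion tree
// has n! leaves — one per permutation.
function permutations(arr) {
  if (arr.length <= 1) return [arr];
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    const rest = [...arr.slice(0, i), ...arr.slice(i + 1)];
    for (const perm of permutations(rest)) {
      result.push([arr[i], ...perm]); // fix arr[i] first, permute the rest
    }
  }
  return result;
}

console.log(permutations([1, 2, 3]).length); // 3! = 6
```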

Three Core Principles

1. Constants Don't Matter

C · O(f) = O(f). Running the same loop twice still yields O(n). Big O cares about the shape of growth, not the exact numbers.

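A sketch of the "run the loop twice" case, counting operations explicitly:

```javascript
// Two separate passes over the same array: 2n operations, still O(n).
function twoPasses(arr) {
  let ops = 0;
  for (const _ of arr) ops++; // first pass: n operations
  for (const _ of arr) ops++; // second pass: another n
  return ops;                 // 2n total — the constant 2 is dropped
}

console.log(twoPasses([1, 2, 3, 4])); // 2 × 4 = 8
```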

2. Largest Term Dominates

O(f) + O(g) = O(max(f, g)). When combining operations of different complexities, the fastest-growing term determines the overall behavior. The slower-growing terms become negligible as n increases.

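A sketch combining a linear pass with a quadratic scan:

```javascript
// n + n² operations total; as n grows, the n² term dominates,
// so the whole function is O(n²).
function mixed(arr) {
  let ops = 0;
  for (const _ of arr) ops++;      // O(n) pass
  for (const _ of arr)
    for (const __ of arr) ops++;   // O(n²) nested scan
  return ops;                      // n + n² → O(n²)
}

console.log(mixed([1, 2, 3])); // 3 + 9 = 12
```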

3. Nested Operations Multiply

O(f) · O(g) = O(f · g). When one operation is nested inside another, their complexities multiply. The inner work executes once for each iteration of the outer operation.

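A sketch of an O(n) outer loop wrapped around an O(log n) halving loop:

```javascript
// The inner halving loop runs ~log₂(n) times per outer iteration,
// so the totals multiply: O(n) · O(log n) = O(n log n).
function nested(n) {
  let ops = 0;
  for (let i = 0; i < n; i++) {                     // outer: n iterations
    for (let w = n; w > 1; w = Math.floor(w / 2)) { // inner: halve until 1
      ops++;
    }
  }
  return ops;
}

console.log(nested(8)); // 8 × log₂(8) = 24
```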

The Master Theorem

For recursive divide-and-conquer algorithms: T(n) = aT(n/b) + O(nᵈ)

T(n) is runtime for input size n.

  • a - number of subproblems
  • b - division factor
  • d - exponent of work done outside recursion

The theorem determines which grows faster: the recursive splitting (a) or the work performed at each level (nᵈ).

  • If a = bᵈ: O(nᵈ log n) (Balanced)
  • If a < bᵈ: O(nᵈ) (External work dominates)
  • If a > bᵈ: O(n^(log_b a)) (Recursion dominates)

Merge Sort: The Balanced Case

a=2, b=2, d=1 → Since 2 = 2¹, the result is O(n log n)


Binary Search: Constant Work Case

a=1, b=2, d=0 → Single recursive call with constant work yields O(log n)

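A recursive sketch makes the Master Theorem parameters explicit:

```javascript
// Binary search as a Master Theorem instance: a = 1 subproblem,
// b = 2 (the window halves), d = 0 (constant work per call) → O(log n).
function bsearch(sorted, target, lo = 0, hi = sorted.length - 1) {
  if (lo > hi) return -1;      // empty window: not found
  const mid = (lo + hi) >> 1;  // constant work outside the recursion
  if (sorted[mid] === target) return mid;
  return sorted[mid] < target
    ? bsearch(sorted, target, mid + 1, hi) // single recursive call (a = 1)
    : bsearch(sorted, target, lo, mid - 1);
}
```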

Matrix Multiplication: Recursion Dominates

Strassen's algorithm (a=7, b=2, d=2) reduces the number of subproblems from 8 to 7, achieving O(n^2.81) instead of the naive O(n³).


Practical Example: Merge Sort

Different markers track distinct aspects of the algorithm's behavior:

  • op() - Counts total recursive calls across all levels (O(n))
  • op1() - Tracks merge operations within each recursion level (O(log n), scoped)
  • op2() - Measures cumulative merge work across the entire algorithm (O(n log n))
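A sketch of merge sort instrumented in the spirit of the markers above, with the visualizer's `op()`/`op2()` replaced by plain counters (hypothetical names): `calls` counts recursive calls, `mergeWork` counts cumulative merge operations.

```javascript
let calls = 0;     // stand-in for op(): total recursive calls → O(n)
let mergeWork = 0; // stand-in for op2(): cumulative merge work → O(n log n)

function mergeSort(arr) {
  calls++; // one per recursive call; 2n − 1 calls in total
  if (arr.length <= 1) return arr;
  const mid = arr.length >> 1;
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length || j < right.length) {
    mergeWork++; // one per merged element; n per level, log n levels
    if (j >= right.length || (i < left.length && left[i] <= right[j])) {
      merged.push(left[i++]);
    } else {
      merged.push(right[j++]);
    }
  }
  return merged;
}
```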

Comparative View

All major complexity classes on a single chart. Note how exponential and factorial growth rapidly overwhelms the others.


Key Points

  • Big O characterizes growth patterns, not precise runtime measurements
  • O(n log n) represents the lower bound for comparison-based sorting
  • Exponential and factorial complexities remain practical only for small inputs (typically n < 20)
  • The Master Theorem provides a systematic way to analyze divide-and-conquer algorithms
  • The visualizer can be useful for building intuition about algorithmic behavior