Asymptotic Notation

Data Structures & Algorithms

Introduction

Asymptotic notation provides a mathematical framework for analyzing algorithm efficiency without being tied to specific hardware or implementation details. Instead of measuring exact execution time (which varies between computers), we analyze how an algorithm’s resource requirements grow relative to input size (n).

Key Benefits:

  • Hardware-independent performance analysis
  • Enables algorithm comparison and selection
  • Predicts scalability behavior
  • Focuses on growth rate rather than exact values

Primary Use Cases:

  • Time Complexity: How execution time grows with input size
  • Space Complexity: How memory usage grows with input size

Note: We typically focus on worst-case scenarios (Big O) because they provide safety guarantees for performance-critical applications.


Mathematical Foundations

Logarithms - The “Halving” Function

log₂(n) answers: “How many times must n be divided by 2 to reach 1?”

Example: log₂(8) = 3 because 8 → 4 → 2 → 1 (3 steps)

Why it matters: Binary search on 1 million items needs only ~20 comparisons (log₂(1,000,000) ≈ 20)
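
A quick way to see this in code (a minimal sketch; HalvingSteps is a made-up helper name, not from the guide):

// Counts how many times n can be halved before reaching 1, i.e. floor(log₂(n))
public static int HalvingSteps(int n)
{
    int steps = 0;
    while (n > 1)
    {
        n /= 2;   // each pass discards half of what remains
        steps++;
    }
    return steps;
}
// HalvingSteps(8) == 3, HalvingSteps(1_000_000) == 19 (log₂(1,000,000) ≈ 19.93)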

Growth Patterns

Polynomial (n^k): Predictable growth

  • n² at n=10: 100 operations
  • n² at n=100: 10,000 operations

Exponential (2^n): Explosive growth

  • 2^10 = 1,024 operations
  • 2^20 = 1,048,576 operations (over 1,000× more work for just 2× the input!)
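
A small console sketch to print these side by side (GrowthDemo is an illustrative name; the output matches the bullets above):

using System;

class GrowthDemo
{
    static void Main()
    {
        // Polynomial n² vs. exponential 2ⁿ at the same input sizes
        foreach (int n in new[] { 10, 20 })
        {
            Console.WriteLine($"n = {n}: n^2 = {n * n:N0}, 2^n = {1L << n:N0}");
        }
        // n = 10: n^2 = 100, 2^n = 1,024
        // n = 20: n^2 = 400, 2^n = 1,048,576
    }
}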

Types of Asymptotic Notation

Big O (O) - Upper Bound (Worst Case)

“At worst, it will take this long”

Like a speed limit: the algorithm's running time never grows faster than this bound (up to constant factors).

Usage: Safety guarantees, most common in practice

Big Omega (Ω) - Lower Bound (Best Case)

“No matter the input, it will take at least this long”

Usage: Proving theoretical limits

Big Theta (Θ) - Tight Bound (Exact)

“It will always take roughly this long”

When Big O = Big Omega, you have Big Theta (consistent performance)
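
Formally, these are the standard definitions (stated here for reference, in LaTeX notation):

$$f(n) = O(g(n)) \iff \exists\, c, n_0 > 0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0$$

$$f(n) = \Omega(g(n)) \iff \exists\, c, n_0 > 0 \text{ such that } f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0$$

$$f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))$$

In words: Big O caps growth from above, Big Omega from below, and Big Theta pins f(n) between two constant multiples of g(n).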


Common Time Complexities

Ordered from Best to Worst Performance:

| Notation | Name | Growth Rate | Example Algorithms |
|----------|------|-------------|--------------------|
| O(1) | Constant | No growth | Array access, hash table lookup |
| O(log n) | Logarithmic | Very slow growth | Binary search, balanced tree operations |
| O(n) | Linear | Proportional growth | Linear search, single loop |
| O(n log n) | Linearithmic | Moderate growth | Merge sort, heap sort |
| O(n²) | Quadratic | Rapid growth | Bubble sort, nested loops |
| O(n³) | Cubic | Very rapid growth | Matrix multiplication (naive) |
| O(2ⁿ) | Exponential | Explosive growth | Recursive Fibonacci, subset generation |
| O(n!) | Factorial | Extremely explosive | Traveling salesman (brute force) |

Scalability Comparison (think of it like cooking for different sized parties):

  • n = 10 (small dinner party): All methods work fine
  • n = 1,000 (wedding reception): O(n²) starts to hurt, exponential becomes impossible
  • n = 1,000,000 (city festival): Only O(1), O(log n), O(n), O(n log n) are practical

Real-world analogy:

  • O(1): Turning on a light switch (same effort regardless of room size)
  • O(log n): Finding a word in a dictionary (flip to middle, eliminate half, repeat)
  • O(n): Counting people in a line (must count each person once)
  • O(n²): Shaking hands with everyone at a party (each person shakes hands with every other person)

Detailed Examples

Constant Time - O(1)

// Array access - always one operation
array[0];

// Hash table lookup - direct access
dictionary[key];

// Stack/Queue operations
stack.Push(5);    // O(1)
stack.Pop();      // O(1)

Logarithmic Time - O(log n)

// Binary search - halves search space each iteration
public static int BinarySearch(int[] arr, int target)
{
    int left = 0, right = arr.Length - 1;

    while (left <= right)
    {
        int mid = left + (right - left) / 2;
        if (arr[mid] == target) return mid;
        if (arr[mid] < target) left = mid + 1;
        else right = mid - 1;
    }
    return -1;
}

Linear Time - O(n)

// Single pass through data
public static int FindMax(int[] arr)
{
    int max = arr[0];
    foreach (int num in arr)  // Visit each once
        if (num > max) max = num;
    return max;
}

Linearithmic Time - O(n log n)

// Merge sort - log n levels, each doing O(n) work
public static void MergeSort(int[] arr, int left, int right)
{
    if (left < right)
    {
        int mid = left + (right - left) / 2;  // Overflow-safe midpoint, as in binary search above
        MergeSort(arr, left, mid);      // Divide
        MergeSort(arr, mid + 1, right);
        Merge(arr, left, mid, right);   // O(n) merge
    }
}
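
The Merge helper called above isn't defined in this guide; here is a minimal sketch of one matching that signature (it uses a temporary buffer, which is also why merge sort costs O(n) extra space):

// Merges two sorted halves arr[left..mid] and arr[mid+1..right] in O(n) time
private static void Merge(int[] arr, int left, int mid, int right)
{
    int[] temp = new int[right - left + 1];
    int i = left, j = mid + 1, k = 0;

    while (i <= mid && j <= right)               // Take the smaller head each step
        temp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
    while (i <= mid) temp[k++] = arr[i++];       // Copy any leftovers from the left half
    while (j <= right) temp[k++] = arr[j++];     // ...or from the right half

    Array.Copy(temp, 0, arr, left, temp.Length); // Write the merged run back
}

Each recursion level does O(n) total merge work across log₂(n) levels, which is where the O(n log n) bound comes from.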

Quadratic Time - O(n²)

// Nested loops over same data - O(n²) duplicate check
public static bool HasDuplicate(int[] arr)
{
    for (int i = 0; i < arr.Length; i++)
        for (int j = i + 1; j < arr.Length; j++)
            if (arr[i] == arr[j]) return true;  // Compare arr[i] and arr[j]
    return false;
}

// Better with a HashSet - O(n)
public static bool HasDuplicateFast(int[] arr)
{
    var seen = new HashSet<int>();
    foreach (int num in arr)
        if (!seen.Add(num)) return true;  // Add returns false if num was already seen
    return false;
}

Exponential Time - O(2ⁿ)

// Naive Fibonacci - each call spawns 2 more
public static int Fib(int n)
{
    if (n <= 1) return n;
    return Fib(n - 1) + Fib(n - 2);  // Exponential branching
}

// Fix with memoization - O(n): store and reuse results
private static readonly Dictionary<int, int> memo = new Dictionary<int, int>();

public static int FibMemo(int n)
{
    if (n <= 1) return n;
    if (memo.TryGetValue(n, out int cached)) return cached;
    return memo[n] = FibMemo(n - 1) + FibMemo(n - 2);  // Each n is computed only once
}

Factorial Time - O(n!)

// Generate all permutations via backtracking - n! possibilities
public static List<List<int>> Permute(int[] nums)
{
    var result = new List<List<int>>();
    void Backtrack(int start)
    {
        if (start == nums.Length) { result.Add(new List<int>(nums)); return; }
        for (int i = start; i < nums.Length; i++)
        {
            (nums[start], nums[i]) = (nums[i], nums[start]);  // Choose
            Backtrack(start + 1);                             // Explore
            (nums[start], nums[i]) = (nums[i], nums[start]);  // Un-choose
        }
    }
    Backtrack(0);
    return result;  // For [1,2,3]: returns 3! = 6 permutations
}

Space Complexity

| Space | Pattern | Example |
|-------|---------|---------|
| O(1) | Fixed variables | int max = arr[0]; |
| O(n) | New array/list | int[] copy = new int[n]; |
| O(n) | Recursive depth | n recursive calls on stack |
| O(n²) | 2D matrix | int[,] matrix = new int[n,n]; |

// O(1) - few variables only
int max = arr[0];
for (int i = 1; i < n; i++)
    if (arr[i] > max) max = arr[i];

// O(n) - recursive call stack
public static int Factorial(int n)
{
    if (n <= 1) return 1;
    return n * Factorial(n - 1);  // n stacked calls = O(n) space
}

Practical Analysis Tips

1. Count Loops (see the sketch after these tips):

  • Single loop → O(n)
  • Nested loops (same size) → O(n²)
  • Loop that halves → O(log n)

2. Recursive Pattern:

  • Each call makes 2 calls → O(2ⁿ)
  • Divide problem in half → O(log n) levels
  • Divide + linear work → O(n log n)

3. Drop Constants:

  • 3n + 5 → O(n)
  • n²/2 + n → O(n²)
  • Keep highest-order term only
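
A sketch tying these rules together (n and DoWork() are placeholders, not from the guide):

// Single loop -> O(n)
for (int i = 0; i < n; i++) DoWork();

// Nested loops over the same range -> O(n · n) = O(n²)
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++) DoWork();

// Loop that halves its range -> O(log n)
for (int size = n; size > 1; size /= 2) DoWork();

// Dropping constants: 3n + 5 iterations is still O(n)
for (int i = 0; i < 3 * n + 5; i++) DoWork();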

Quick Reference

Complexity Hierarchy

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)

Practical Input Limits

| Complexity | Max Practical n | Example |
|------------|-----------------|---------|
| O(1), O(log n) | Any size | Array access, binary search |
| O(n) | ~10⁸ | Linear scan |
| O(n log n) | ~10⁶ | Merge sort |
| O(n²) | ~10⁴ | Bubble sort |
| O(2ⁿ) | ~20 | Subsets |
| O(n!) | ~10 | Permutations |

Common Algorithm Complexities

| Operation | Time | Space | Notes |
|-----------|------|-------|-------|
| Binary Search | O(log n) | O(1) | Sorted array required |
| Merge Sort | O(n log n) | O(n) | Stable, guaranteed |
| Hash Lookup | O(1) avg | O(n) | Worst case O(n) |
| DFS/BFS | O(V + E) | O(V) | Graph traversal |

