# Chapter 2: Complexity Analysis
## Big O Notation
Big O notation is a mathematical way to describe the limiting behavior of a function as its argument tends toward a particular value or infinity. In computer science, it is most often used to describe the worst-case time or space complexity of an algorithm.
Big O notation provides an upper bound on the growth rate of an algorithm's resource requirements. It helps us understand how an algorithm's performance scales with input size, allowing us to compare different algorithms and make informed decisions about which one to use.
### Common Big O Complexities
| Notation | Name | Example |
|---|---|---|
| O(1) | Constant | Array access, hash table lookup |
| O(log n) | Logarithmic | Binary search |
| O(n) | Linear | Linear search, iterating through array |
| O(n log n) | Linearithmic | Merge sort, heap sort |
| O(n²) | Quadratic | Bubble sort, nested loops |
| O(2ⁿ) | Exponential | Recursive Fibonacci (naive) |
| O(n!) | Factorial | Generating all permutations |
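The growth rates in the table above can be made concrete by counting operations in small functions, one per class (a minimal sketch; the function names are illustrative, not standard library code):

```python
def constant(arr):
    """O(1): a single index access, regardless of len(arr)."""
    return arr[0]

def logarithmic(n):
    """O(log n): halve the problem size until it reaches 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def quadratic(arr):
    """O(n^2): every pair of elements is visited."""
    pairs = 0
    for i in range(len(arr)):
        for j in range(len(arr)):
            pairs += 1
    return pairs

print(logarithmic(1_000_000))  # about 20 halvings of the input
print(quadratic([1, 2, 3]))    # 9 pair visits for n = 3
```

Note how `logarithmic` touches only ~20 values for an input of a million, while `quadratic` already performs n² steps for tiny inputs.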
## Time Complexity
Time complexity describes how the runtime of an algorithm increases as the input size grows. It's one of the most important metrics for evaluating algorithm efficiency.
### Analyzing Time Complexity
To analyze time complexity, count the number of basic operations performed:
```python
# Example: Linear Search
def linear_search(arr, target):
    for i in range(len(arr)):   # O(n) iterations
        if arr[i] == target:    # O(1) comparison
            return i            # O(1) operation
    return -1                   # O(1) operation

# Overall: O(n) time complexity
```
### Best, Average, and Worst Case
- Best case: the minimum time required (e.g., finding the target at the first position)
- Average case: the expected time over all inputs (often assuming a uniform distribution)
- Worst case: the maximum time required (e.g., the target is absent or at the last position)

These cases are often labeled with Ω, Θ, and O respectively, but strictly speaking Ω, Θ, and O are lower, tight, and upper bounds on a function's growth, and any of the three bounds can be applied to any of the three cases.
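The difference between cases can be observed directly with an instrumented search that reports how many comparisons it made (a sketch; `linear_search_counted` is an illustrative variant, not part of the chapter's earlier code):

```python
def linear_search_counted(arr, target):
    """Linear search that also reports how many comparisons it made."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [3, 7, 1, 9, 4]
print(linear_search_counted(data, 3))   # best case: (0, 1) -- one comparison
print(linear_search_counted(data, 99))  # worst case: (-1, 5) -- n comparisons
```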
## Space Complexity
Space complexity describes how much memory an algorithm uses relative to the input size. This includes both auxiliary space (extra space) and input space.
### Space Complexity Examples
```python
# O(1) space - constant auxiliary space
def find_max(arr):
    max_val = arr[0]        # O(1) space
    for num in arr:         # O(1) space for the loop variable
        if num > max_val:
            max_val = num
    return max_val

# O(n) space - linear space
def copy_array(arr):
    result = []             # grows to O(n) space
    for num in arr:
        result.append(num)
    return result
```
```python
# O(log n) space - recursive call stack
def binary_search_recursive(arr, target, left, right):
    if left > right:
        return -1
    mid = (left + right) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        return binary_search_recursive(arr, target, mid + 1, right)
    else:
        return binary_search_recursive(arr, target, left, mid - 1)
```
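Because CPython does not optimize tail calls, each recursive call above occupies a stack frame, giving O(log n) auxiliary space. An iterative version (a sketch, not from the text above) performs the same O(log n) comparisons with O(1) auxiliary space:

```python
def binary_search_iterative(arr, target):
    """O(log n) time, O(1) auxiliary space: no call-stack growth."""
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

print(binary_search_iterative([1, 3, 5, 7, 9], 7))  # 3
print(binary_search_iterative([1, 3, 5, 7, 9], 4))  # -1 (not found)
```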
## Complexity Comparison
Understanding how different complexities compare helps in choosing the right algorithm:
For an input size of n = 1,000,000:
- O(1): 1 operation
- O(log n): ~20 operations
- O(n): 1,000,000 operations
- O(n log n): ~20,000,000 operations
- O(n²): 1,000,000,000,000 operations
As you can see, the difference between O(n) and O(n²) becomes enormous as input size grows!
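The operation counts above can be reproduced directly (a quick sketch using base-2 logarithms, the usual convention in algorithm analysis):

```python
import math

n = 1_000_000
print(f"O(log n):   {math.log2(n):,.0f}")      # ~20
print(f"O(n):       {n:,}")                    # 1,000,000
print(f"O(n log n): {n * math.log2(n):,.0f}")  # ~20,000,000
print(f"O(n^2):     {n ** 2:,}")               # 1,000,000,000,000
```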
## Practical Tips for Complexity Analysis
- Focus on the dominant term: O(n² + n) = O(n²)
- Ignore constants: O(2n) = O(n)
- Consider nested loops: Nested loops often indicate O(n²) or higher
- Recursion depth matters: Each recursive call adds to space complexity
- Data structure choice: Different operations have different complexities
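The first three tips show up together in a single function (an illustrative sketch; `process` is a made-up name):

```python
def process(arr):
    n = len(arr)
    total = 0
    # Single pass: O(n)
    for x in arr:
        total += x
    # Nested loops: O(n^2) -- this term dominates
    for i in range(n):
        for j in range(n):
            total += arr[i] * arr[j]
    # O(n^2 + n) simplifies to O(n^2); likewise, constants drop: O(2n) = O(n)
    return total

print(process([1, 2, 3]))  # 6 + 36 = 42
```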
## What's Next?
Now that you understand complexity analysis, you're ready to explore specific algorithms. Check out:
- Chapter 3: Recursion & Divide-and-Conquer - Understand recursive complexity
- Chapter 4: Trees - Analyze tree operation complexities
- Chapter 5: Sorting Algorithms - See complexity analysis in action
- Chapter 6: Searching Algorithms - Compare O(n) vs O(log n) search