Time Complexity and Big O Notation
Time complexity is a way to analyze how the runtime of an algorithm grows as the size of the input grows. It's a critical concept for writing efficient and scalable code, especially in competitive programming where solutions are judged against time limits.
Big O Notation
We use Big O notation to classify algorithms according to how their run time or space requirements grow as the input size (n) grows. Big O describes the worst-case scenario, providing an upper bound on the growth rate. When analyzing algorithms, we are interested in the order of growth, so we drop constant factors and non-dominant terms.
- O(2n) becomes O(n)
- O(n² + n) becomes O(n²)
- O(n + log n) becomes O(n)
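A minimal sketch (the function names here are illustrative, not from the original) of why the non-dominant terms are dropped: as n grows, the n² term completely dominates n² + n, so both expressions describe the same growth rate.

```python
def op_counts(n):
    """Operation counts for two hypothetical algorithms:
    one doing 2n work (two passes over the input) and one doing n^2 + n work."""
    return 2 * n, n * n + n

for n in (10, 1_000, 100_000):
    linear, quadratic = op_counts(n)
    # The lone n term becomes a vanishing fraction of n^2 + n as n grows,
    # which is why O(n^2 + n) is written as O(n^2).
    print(n, linear, quadratic, f"n's share of n^2+n: {n / quadratic:.4%}")
```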
Execution Time Limits
In a typical competitive programming environment, a computer can perform around 10⁸ operations per second. This is a crucial rule of thumb. When you know the constraints on the input size 'n', you can estimate the maximum allowable time complexity.
| Input Size (n) | Required Complexity |
|---|---|
| 10 - 20 | O(n!) or O(2ⁿ) |
| ≤ 500 | O(n³) |
| ≤ 5000 | O(n²) |
| ≤ 10⁵ - 10⁶ | O(n log n) or O(n) |
| ≥ 10⁷ | O(log n) or O(1) |
For example, if the input size is n = 10⁵, an O(n²) algorithm would take roughly (10⁵)² = 10¹⁰ operations. At 10⁸ operations per second, this is far too slow (approx. 100 seconds). However, an O(n log n) algorithm would take about 10⁵ * log₂(10⁵) ≈ 10⁵ * 17 ≈ 1.7 * 10⁶ operations, which is well within the time limit.
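The estimate above can be checked directly. This sketch uses the 10⁸ operations-per-second rule of thumb from the text (the constant and helper name are illustrative):

```python
import math

OPS_PER_SECOND = 10**8  # rule-of-thumb machine speed from the text

def estimated_seconds(operations):
    """Rough wall-clock estimate for a given operation count."""
    return operations / OPS_PER_SECOND

n = 10**5
quadratic_ops = n * n             # O(n^2): 10^10 operations
nlogn_ops = n * math.log2(n)      # O(n log n): ~1.7 * 10^6 operations

print(estimated_seconds(quadratic_ops))  # 100.0 seconds -> far too slow
print(estimated_seconds(nlogn_ops))      # ~0.017 seconds -> comfortably fast
```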
Common Time Complexities
Here are some common complexities, ordered from fastest to slowest.
- O(1) - Constant Time: The runtime doesn't change with the input size. Examples: accessing an array element by index, pushing to a stack.
- O(log n) - Logarithmic Time: The runtime grows logarithmically. The algorithm cuts the problem size by a constant fraction with each step. Example: Binary Search.
- O(n) - Linear Time: The runtime grows linearly with the input size. Example: iterating through an array or linked list once.
- O(n log n) - Log-Linear Time: This is a very common complexity for efficient sorting algorithms. Examples: Merge Sort, Quick Sort (average case).
- O(n²) - Quadratic Time: The runtime grows quadratically. Often involves nested loops over the input. Examples: Bubble Sort, Insertion Sort, traversing a 2D matrix naively.
- O(2ⁿ) - Exponential Time: The runtime doubles with each addition to the input data set. Often seen in recursive algorithms that generate all subsets of a set.
- O(n!) - Factorial Time: The runtime grows factorially. This is very slow and usually found in algorithms that try every permutation of the input, like a brute-force solution to the Traveling Salesman Problem.
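To make the gap between O(log n) and O(n) concrete, here is a sketch of Binary Search (mentioned above) instrumented with a step counter; the `steps` return value is added purely for illustration:

```python
def binary_search(arr, target):
    """O(log n): halve the search range on every comparison.
    Assumes arr is sorted. Returns (index, steps) or (-1, steps) if absent."""
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1  # target is in the right half
        else:
            hi = mid - 1  # target is in the left half
    return -1, steps

arr = list(range(1_000_000))
index, steps = binary_search(arr, 999_999)
# A linear scan would need up to 10^6 comparisons; binary search needs
# at most ~20, since 2^20 > 10^6.
print(index, steps)
```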