Sorting: Arranging Data in Order
Sorting is the process of arranging items in a sequence. It's a fundamental computer science problem and a perfect case study for algorithm design and analysis. Understanding different sorting algorithms reveals trade-offs between time complexity, space complexity, and stability.
Comparison-Based Sorting Algorithms
These algorithms determine the sorted order by comparing pairs of elements.
1. Bubble Sort
The simplest sorting algorithm. It repeatedly steps through the list, compares adjacent elements, and swaps any that are out of order, "bubbling" the largest elements to the end. These passes continue until the list is sorted.
Time: O(n²) | Space: O(1) | Stable: Yes
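A minimal sketch of the idea (the function name bubble_sort and the early-exit flag are illustrative additions, not part of the original description):

def bubble_sort(arr: list) -> None:
    # Each pass bubbles the largest remaining element to the end of the unsorted prefix.
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):  # the last i elements are already in place
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps in a full pass means the list is sorted
            break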
2. Selection Sort
This algorithm divides the list into a sorted and an unsorted part. It repeatedly finds the minimum element from the unsorted part and swaps it into its correct position at the end of the sorted part.
Time: O(n²) | Space: O(1) | Stable: No
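A minimal sketch (the function name selection_sort is illustrative):

def selection_sort(arr: list) -> None:
    n = len(arr)
    for i in range(n - 1):
        # Find the smallest element in the unsorted part arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap it into position i, the end of the sorted part.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]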
3. Insertion Sort
Builds the final sorted array one item at a time. It iterates through the input elements and inserts each element into its correct position in the already-sorted part of the array.
Time: O(n²) worst-case, O(n) best-case | Space: O(1) | Stable: Yes
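A minimal sketch (the function name insertion_sort is illustrative):

def insertion_sort(arr: list) -> None:
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift elements of the sorted prefix that are larger than key one slot right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert key into its correct position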
4. Merge Sort
A highly efficient, stable, "divide and conquer" algorithm. It recursively divides the list into halves until each sublist contains a single element, then merges those sublists back together in sorted order.
[4, 2, 7, 1] -> [4, 2] | [7, 1] -> [4] [2] | [7] [1] -> [2, 4] | [1, 7] -> [1, 2, 4, 7]
Time: O(n log n) | Space: O(n) | Stable: Yes
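A minimal sketch that returns a new sorted list rather than sorting in place (the function name merge_sort is illustrative):

def merge_sort(arr: list) -> list:
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order, preserving stability
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged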
5. Quick Sort
Another "divide and conquer" algorithm. It picks a 'pivot' element and partitions the other elements into two sub-arrays, according to whether they are less than or greater than the pivot. The sub-arrays are then sorted recursively. It's often faster in practice than Merge Sort due to better cache performance, but it is not stable and its worst-case performance is O(n²).
Time: O(n log n) average, O(n²) worst-case | Space: O(log n) | Stable: No
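A minimal in-place sketch using Lomuto partitioning with the last element as the pivot (the helper signature and the pivot choice are illustrative; practical implementations pick pivots more carefully to avoid the O(n²) worst case):

def quick_sort(arr: list, lo: int = 0, hi=None) -> None:
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return
    pivot = arr[hi]
    i = lo
    # Move every element smaller than the pivot to the front of the range.
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # place the pivot between the two partitions
    quick_sort(arr, lo, i - 1)
    quick_sort(arr, i + 1, hi)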
Non-Comparison and Hybrid Sorts
Some algorithms sort without direct comparisons, or combine methods for optimal performance.
- Counting Sort: Counts the occurrences of each distinct key value, then reconstructs the sorted output from those counts. Efficient for integer keys within a small range (see the sketch after this list).
- Radix Sort: Sorts integers by processing individual digits.
- Timsort & Introsort: Hybrid algorithms used in the Python (Timsort) and C++ (Introsort) standard libraries. Timsort blends merge sort with insertion sort; introsort starts with quicksort and falls back to heapsort and insertion sort, giving excellent real-world performance.
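A minimal sketch of counting sort for non-negative integer keys (the function name and the non-negative-key assumption are illustrative):

def counting_sort(arr: list) -> list:
    if not arr:
        return []
    counts = [0] * (max(arr) + 1)  # one counter per possible key value
    for x in arr:
        counts[x] += 1
    # Rebuild the output by emitting each value as many times as it was counted.
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)
    return result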
Classic Problems
Sort Colors (Dutch National Flag problem)
Given an array with n objects colored red, white, or blue (represented by 0, 1, 2), sort them in-place. This is a classic single-pass O(n) problem solvable with three pointers.
from typing import List

class Solution:
    def sortColors(self, nums: List[int]) -> None:
        """
        Do not return anything, modify nums in-place instead.
        """
        # Three pointers: everything before `low` is 0, everything after `high` is 2,
        # and `mid` scans the unknown region in between.
        low, mid, high = 0, 0, len(nums) - 1
        while mid <= high:
            if nums[mid] == 0:
                nums[low], nums[mid] = nums[mid], nums[low]
                low += 1
                mid += 1
            elif nums[mid] == 1:
                mid += 1
            else:  # nums[mid] == 2
                nums[mid], nums[high] = nums[high], nums[mid]
                high -= 1  # do not advance mid: the swapped-in value is still unexamined
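For example, with an illustrative input, the array is rearranged in place:

nums = [2, 0, 2, 1, 1, 0]
Solution().sortColors(nums)
print(nums)  # [0, 0, 1, 1, 2, 2]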
Kth Largest Element in an Array
Find the kth largest element. The most straightforward way is to sort the array first, which takes O(n log n) time. After sorting, the kth largest element will be at index `n - k`.
from typing import List

class Solution:
    def findKthLargest(self, nums: List[int], k: int) -> int:
        # A simple solution is to sort the array and pick the element.
        # Sorting takes O(n log n) time.
        nums.sort()
        # The kth largest element is at the (n-k)th index.
        return nums[len(nums) - k]
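For example, with illustrative values:

nums = [3, 2, 1, 5, 6, 4]
print(Solution().findKthLargest(nums, 2))  # 5, the 2nd largest element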
Visualize the Algorithms
The best way to understand these algorithms is to see them run. Use the interactive simulation below to watch how each one operates on a sample dataset.
Watch sorting algorithms in action. Select an algorithm and press the reset button to generate a new array.