Deep Dive into 5 Essential Sorting Algorithms

April 9, 2025

Detailed Report on Common Sorting Algorithms

Introduction

Sorting algorithms are fundamental computer science concepts used to arrange elements of a list or array in a specific order (typically numerical or lexicographical). Understanding different sorting algorithms is crucial for optimizing performance, as their efficiency varies significantly based on factors like input size, existing order of elements, and memory constraints. This report details five common sorting algorithms: Bubble Sort, Insertion Sort, Selection Sort, Quick Sort, and Merge Sort, exploring their mechanisms, implementations, and performance characteristics.


1. Bubble Sort

Concept/Mechanism

Bubble Sort is one of the simplest sorting algorithms. It works by repeatedly stepping through the list, comparing each pair of adjacent items, and swapping them if they are in the wrong order. This process is repeated until no swaps are needed, indicating the list is sorted. The algorithm gets its name because smaller or larger elements (depending on the sorting order) "bubble" up to their correct position in the list, similar to bubbles rising in water. In each pass through the list, the largest unsorted element is guaranteed to move to its final sorted position at the end of the unsorted portion.

Algorithm Steps (Code Walkthrough - bubbleSort)

The implementation uses two nested loops:

  1. Outer Loop (for (let n = array.length; n > 1; n--)): This loop controls the number of passes and shrinks the considered portion of the array after each pass. Since the largest remaining element "bubbles" to the end in each pass, the effective size of the range that still needs sorting (n) decreases by one each time, and the loop stops once only a single element remains (which is trivially in place).
  2. Inner Loop (for (let i = 0; i < n - 1; i++)): This loop iterates through the currently unsorted portion of the array, comparing each element with its neighbour up to index n - 1.
  3. Comparison and Swap (if (array[i] > array[i + 1])): Inside the inner loop, it compares the current element (array[i]) with the next adjacent element (array[i + 1]). If the current element is greater than the next one, they are swapped. The swap is efficiently done using array destructuring: [array[i], array[i + 1]] = [array[i + 1], array[i]].
// bubble.ts - Core Logic
export function bubbleSort(array: number[]) {
  // Outer loop reduces the comparison range after each pass
  for (let n = array.length; n > 1; n--) {
    // Inner loop compares adjacent elements
    for (let i = 0; i < n - 1; i++) {
      // Swap if current element is larger than the next
      if (array[i] > array[i + 1]) {
        ;[array[i], array[i + 1]] = [array[i + 1], array[i]]
      }
    }
  }
  return array
}

Time Complexity

Bubble Sort's average and worst-case time complexity is O(n²): the two nested loops each run on the order of n times (where n is the number of elements). The implementation above always performs every pass; adding an early-exit flag that stops as soon as a pass makes no swaps gives a best case of O(n) on an already sorted array.
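
A minimal sketch of that optimization (the bubbleSortEarlyExit name and the swapped flag are illustrative additions, not part of the bubble.ts above):

// bubble-early-exit.ts - illustrative sketch
export function bubbleSortEarlyExit(array: number[]) {
  for (let n = array.length; n > 1; n--) {
    let swapped = false // Track whether this pass changed anything
    for (let i = 0; i < n - 1; i++) {
      if (array[i] > array[i + 1]) {
        ;[array[i], array[i + 1]] = [array[i + 1], array[i]]
        swapped = true
      }
    }
    // No swaps in a full pass means the array is already sorted; stop early
    if (!swapped) break
  }
  return array
}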

Space Complexity

Bubble Sort is an in-place algorithm with O(1) auxiliary space: apart from the loop counters and the temporary storage used during a swap, it needs no memory beyond the input array.

2. Insertion Sort

Concept/Mechanism

Insertion Sort builds the final sorted array one item at a time. It iterates through the input array and for each element, it finds the correct position within the already sorted portion of the array (to the left of the current element) and inserts it there. This involves shifting larger elements one position to the right to make space for the inserted element. It's analogous to how one might sort a hand of playing cards.

Algorithm Steps (Code Walkthrough - insertionSort)

  1. Outer Loop (for (let i = 1; i < array.length; i++)): Iterates through the array starting from the second element (i=1), as the first element (i=0) is considered a sorted sublist of size one.
  2. Store Current Value (const value = array[i]): The element to be inserted is stored in value.
  3. Initialize j (let j = i - 1): j points to the last element in the already sorted sublist.
  4. Inner Loop (for (; j >= 0; j--)): This loop moves backward through the sorted sublist (array[0...i-1]).
  5. Compare and Shift (if (array[j] > value)): If the element in the sorted sublist (array[j]) is greater than the value we want to insert, it's shifted one position to the right (array[j + 1] = array[j]).
  6. Break (else { break }): If an element array[j] is found that is less than or equal to value, it means the correct position for value has been found (just after j), so the inner loop breaks.
  7. Insert Value (array[j + 1] = value): After the inner loop finishes (either by reaching the beginning or breaking early), the value is placed in its correct position (j+1).
// insertion.ts - Core Logic
export function insertionSort(array: number[]) {
  // Iterate from the second element
  for (let i = 1; i < array.length; i++) {
    const value = array[i] // Element to insert
    let j = i - 1       // Index for sorted part
 
    // Move backwards through sorted part
    for (; j >= 0; j--) {
      // Shift elements larger than 'value' to the right
      if (array[j] > value) {
        array[j + 1] = array[j]
      } else {
        // Found the correct spot, break
        break
      }
    }
    // Insert 'value' into its correct position
    array[j + 1] = value
  }
  return array
}

Time Complexity

Like Bubble Sort, the worst-case and average complexity is O(n²) because of the nested loop structure. The best case is O(n): on an already sorted array the inner loop breaks after a single comparison per element. In practice Insertion Sort usually outperforms Bubble Sort (despite the same average Big O) because comparisons and shifts stop as soon as the insertion point is found, which makes it particularly effective on small or nearly sorted inputs.
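
To make the best/worst-case gap concrete, here is a small counting sketch; the insertionSortCount helper is hypothetical, added only to tally comparisons:

// insertion-count.ts - illustrative sketch, not part of insertion.ts
function insertionSortCount(array: number[]): number {
  let comparisons = 0
  for (let i = 1; i < array.length; i++) {
    const value = array[i]
    let j = i - 1
    for (; j >= 0; j--) {
      comparisons++
      if (array[j] > value) {
        array[j + 1] = array[j]
      } else {
        break
      }
    }
    array[j + 1] = value
  }
  return comparisons
}

// Already sorted input: n - 1 comparisons (best case, O(n))
console.log(insertionSortCount([1, 2, 3, 4, 5])) // 4
// Reverse-sorted input: n(n - 1)/2 comparisons (worst case, O(n²))
console.log(insertionSortCount([5, 4, 3, 2, 1])) // 10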

Space Complexity

Insertion Sort is also in-place with O(1) auxiliary space; the only extra storage is the value variable holding the element being inserted and the index j.

3. Selection Sort

Concept/Mechanism

Selection Sort divides the input list into two parts: a sorted sublist which is built up from left to right at the front (beginning) of the list, and a sublist of the remaining unsorted items that occupy the rest of the list. Initially, the sorted sublist is empty and the unsorted sublist is the entire input list. The algorithm proceeds by finding the smallest (or largest, depending on sorting order) element in the unsorted sublist, swapping it with the leftmost unsorted element (putting it in sorted order), and moving the sublist boundaries one element to the right.

Algorithm Steps (Code Walkthrough - selectionSort)

  1. Outer Loop (for (let i = 0; i < array.length - 1; i++)): Iterates from the first element up to the second-to-last element. i represents the boundary between the sorted (left) and unsorted (right) portions.
  2. Initialize minIndex (let minIndex = i): Assumes the first element of the unsorted portion (array[i]) is the minimum initially.
  3. Inner Loop (for (let j = i + 1; j < array.length; j++)): Iterates through the remaining unsorted portion of the array (from i+1 to the end).
  4. Find Minimum (if (array[j] < array[minIndex])): Compares the current element (array[j]) with the element at the current minIndex. If array[j] is smaller, minIndex is updated to j.
  5. Swap (if (minIndex !== i)): After the inner loop finishes, minIndex holds the index of the smallest element in the unsorted portion. If this smallest element isn't already at the beginning of the unsorted portion (i.e., minIndex !== i), it's swapped with the element at index i.
// selection.ts - Core Logic
export function selectionSort(array: number[]) {
  // Iterate through the array defining the sorted boundary
  for (let i = 0; i < array.length - 1; i++) {
    let minIndex = i // Assume current is minimum
 
    // Iterate through the unsorted part to find the actual minimum
    for (let j = i + 1; j < array.length; j++) {
      if (array[j] < array[minIndex]) {
        minIndex = j // Update minimum index
      }
    }
 
    // Swap the found minimum element with the first element of the unsorted part
    if (minIndex !== i) {
      ;[array[i], array[minIndex]] = [array[minIndex], array[i]]
    }
  }
  return array
}

Time Complexity

Selection Sort always performs O(n²) comparisons because the inner loop invariably searches the entire remaining unsorted portion to find the minimum, regardless of the input array's initial order. The number of swaps is O(n) in the worst case (one swap per outer loop iteration), which is better than Bubble Sort's worst-case swaps, but the comparison count dominates the complexity.
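
A small counting sketch (the selectionSortStats helper is hypothetical, added for illustration) backs this up: the comparison count is fixed at n(n - 1)/2 regardless of input order, while the swap count stays below n:

// selection-stats.ts - illustrative sketch, not part of selection.ts
function selectionSortStats(array: number[]): { comparisons: number; swaps: number } {
  let comparisons = 0
  let swaps = 0
  for (let i = 0; i < array.length - 1; i++) {
    let minIndex = i
    for (let j = i + 1; j < array.length; j++) {
      comparisons++
      if (array[j] < array[minIndex]) {
        minIndex = j
      }
    }
    if (minIndex !== i) {
      ;[array[i], array[minIndex]] = [array[minIndex], array[i]]
      swaps++
    }
  }
  return { comparisons, swaps }
}

// For n = 5, always 10 comparisons; at most 4 swaps
console.log(selectionSortStats([5, 4, 3, 2, 1])) // { comparisons: 10, swaps: 2 }
console.log(selectionSortStats([1, 2, 3, 4, 5])) // { comparisons: 10, swaps: 0 }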

Space Complexity

Selection Sort is in-place with O(1) auxiliary space; it needs only the minIndex variable and the temporary used during the swap.

4. Quick Sort

Concept/Mechanism

Quick Sort is an efficient, divide-and-conquer sorting algorithm. Its main steps are:

  1. Pick a Pivot: Choose an element from the array as the pivot.
  2. Partition: Rearrange the array so that all elements smaller than the pivot come before it, while all elements greater than the pivot come after it. After partitioning, the pivot is in its final sorted position.
  3. Recurse: Recursively apply the above steps to the sub-array of elements with smaller values and the sub-array of elements with greater values.

The base case for the recursion is when a sub-array has zero or one element, which is inherently sorted. The efficiency heavily depends on the choice of the pivot; ideally, the pivot should divide the array into two roughly equal halves.

Algorithm Steps (Code Walkthrough - quickSort)

This implementation uses the last element as the pivot and creates new arrays for partitions (less efficient in terms of space than in-place partitioning but simpler to understand conceptually).

  1. Base Case (if (array.length <= 1)): If the array has 0 or 1 element, it's already sorted, so return it.
  2. Choose Pivot (const pivot = array[array.length - 1]): Selects the last element as the pivot.
  3. Create Partition Arrays (const left = []; const right = []): Initialize empty arrays to hold elements smaller and larger than the pivot.
  4. Partition Loop (for (let i = 0; i < array.length - 1; i++)): Iterate through the array (excluding the pivot).
  5. Populate Partitions: If array[i] < pivot, push array[i] to left. Otherwise (if array[i] >= pivot), push it to right.
  6. Recursive Calls and Concatenation (return [...quickSort(left), pivot, ...quickSort(right)]): Recursively sort the left and right arrays. The final sorted array is constructed by concatenating the sorted left array, the pivot (which is now in its correct position relative to the partitions), and the sorted right array.
// quick.ts - Core Logic
export function quickSort(array: number[]): number[] {
  // Base case: array with 0 or 1 element is sorted
  if (array.length <= 1) {
    return array
  }
 
  // Choose the last element as pivot
  const pivot = array[array.length - 1]
  const left: number[] = [] // Elements smaller than pivot
  const right: number[] = [] // Elements greater than or equal to pivot
 
  // Partition the array (excluding the pivot itself)
  for (let i = 0; i < array.length - 1; i++) {
    if (array[i] < pivot) {
      left.push(array[i])
    } else {
      right.push(array[i])
    }
  }
 
  // Recursively sort sub-arrays and combine
  return [...quickSort(left), pivot, ...quickSort(right)]
}

Time Complexity

On average Quick Sort runs in O(n log n): a reasonably balanced pivot splits the array roughly in half, giving O(log n) levels of recursion with O(n) partitioning work per level. The worst case is O(n²), which occurs when the pivot is repeatedly the smallest or largest remaining element; with the last-element pivot used here, an already sorted (or reverse-sorted) input triggers it. Choosing a random pivot or the median of three makes the worst case unlikely in practice.

Space Complexity

This implementation allocates new left and right arrays at every level of recursion, so it uses O(n) auxiliary space in addition to the recursion stack. An in-place partitioning variant keeps the data in the original array and, with balanced pivots, needs only the O(log n) recursion stack (as reflected in the table below).
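
For comparison, here is a hedged sketch of such an in-place variant using Lomuto partitioning; quickSortInPlace and partition are illustrative names and are not part of the quick.ts shown above:

// quick-inplace.ts - illustrative sketch of an in-place variant
export function quickSortInPlace(
  array: number[],
  low = 0,
  high = array.length - 1
): number[] {
  if (low < high) {
    // Partition around the last element and get the pivot's final index
    const p = partition(array, low, high)
    quickSortInPlace(array, low, p - 1)  // Sort elements left of the pivot
    quickSortInPlace(array, p + 1, high) // Sort elements right of the pivot
  }
  return array
}

function partition(array: number[], low: number, high: number): number {
  const pivot = array[high] // Last element as pivot, as in the version above
  let i = low - 1           // Boundary of the "smaller than pivot" region
  for (let j = low; j < high; j++) {
    if (array[j] < pivot) {
      i++
      ;[array[i], array[j]] = [array[j], array[i]]
    }
  }
  // Place the pivot just after the "smaller" region
  ;[array[i + 1], array[high]] = [array[high], array[i + 1]]
  return i + 1
}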


5. Merge Sort

Concept/Mechanism

Merge Sort is another efficient, divide-and-conquer algorithm. It works as follows:

  1. Divide: If the list has more than one element, divide it recursively into two halves (approximately equal) until you have sublists of size 1. A list of size 1 is considered sorted.
  2. Conquer (Merge): Repeatedly merge the sorted sublists back together, creating larger sorted sublists, until you have merged all sublists into a single, fully sorted list. The merging process involves comparing elements from the two sorted sublists and placing the smaller element into a new, combined list.

Algorithm Steps (Code Walkthrough - mergeSort)

  1. Base Case (if (arr.length <= 1)): An array with 0 or 1 element is sorted.
  2. Divide (const middle = Math.floor(arr.length / 2); const left = ...; const right = ...): Find the middle index. Recursively call mergeSort on the left half (arr.slice(0, middle)) and the right half (arr.slice(middle)). Note that slice creates copies.
  3. Merge (const result = []; while(...) {...}):
    • Initialize an empty result array and pointers i (for left) and j (for right) to 0.
    • While both left and right sublists have elements remaining (i < left.length && j < right.length):
      • Compare left[i] and right[j].
      • Push the smaller element (taking from left on ties, which keeps the sort stable) onto the result array and increment the corresponding pointer (i or j).
    • After the loop, one of the sublists might still have remaining elements (which are guaranteed to be larger than all elements currently in result).
  4. Concatenate Remaining Elements (return result.concat(...)): Append any remaining elements from the left sublist (left.slice(i)) and the right sublist (right.slice(j)) to the result. Only one of these slices will actually contain elements.
// merge.ts - Core Logic
export function mergeSort(arr: number[]): number[] {
  // Base case: array with 0 or 1 element is sorted
  if (arr.length <= 1) {
    return arr
  }
 
  // Find the middle point and divide the array
  const middle = Math.floor(arr.length / 2)
  // Recursively sort the left and right halves
  const left = mergeSort(arr.slice(0, middle))
  const right = mergeSort(arr.slice(middle))
 
  // Merge the sorted halves
  const result: number[] = []
  let i = 0, // Pointer for left array
    j = 0   // Pointer for right array
 
  // Compare elements from left and right, add smaller to result
  while (i < left.length && j < right.length) {
    // '<=' keeps equal elements in left-before-right order (stable)
    if (left[i] <= right[j]) {
      result.push(left[i])
      i++
    } else {
      result.push(right[j])
      j++
    }
  }
 
  // Concatenate any remaining elements (from either left or right)
  return result.concat(left.slice(i)).concat(right.slice(j))
}

Time Complexity

Merge Sort consistently divides the array into halves, resulting in a recursion depth of O(log n). The merging process at each level takes O(n) time because every element needs to be looked at and placed into the result array. This leads to a reliable O(n log n) performance regardless of the initial order of elements.

Space Complexity

Merge Sort is not in-place: slice creates copies of the halves at every level of recursion and the merge step builds a new result array, so it needs O(n) auxiliary space overall, plus an O(log n) recursion stack.

Comparison Summary

| Algorithm | Time Complexity (Best) | Time Complexity (Average) | Time Complexity (Worst) | Space Complexity (Worst) | In-Place | Stable* | Notes |
| :------------- | :--------------------- | :------------------------ | :---------------------- | :----------------------- | :-------- | :------ | :--------------------------------------------- |
| Bubble Sort | O(n)¹ | O(n²) | O(n²) | O(1) | Yes | Yes | Simple but inefficient for large lists. |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes | Efficient for small or nearly sorted lists. |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | Yes | No | Simple, minimizes swaps, inefficient compares. |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(n)² / O(log n)³ | No² / Yes³ | No | Generally fastest on average, recursive. |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | No | Yes | Consistent performance, requires extra space. |

\* Stable: equal elements keep their original relative order.
¹ With the early-exit (no-swap) optimization; the basic implementation shown above always performs O(n²) comparisons.
² For the simple implementation shown here, which builds new left/right arrays.
³ For an in-place partitioning variant, whose extra space is the recursion stack.
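
As a quick sanity check, the five functions can be exercised side by side; this sketch assumes import paths matching the file names used in the snippets above:

// usage.ts - illustrative sketch (import paths are assumptions)
import { bubbleSort } from './bubble'
import { insertionSort } from './insertion'
import { selectionSort } from './selection'
import { quickSort } from './quick'
import { mergeSort } from './merge'

const input = [5, 3, 8, 1, 9, 2]

// The in-place sorts mutate their argument, so pass each one a copy
console.log(bubbleSort([...input]))    // [1, 2, 3, 5, 8, 9]
console.log(insertionSort([...input])) // [1, 2, 3, 5, 8, 9]
console.log(selectionSort([...input])) // [1, 2, 3, 5, 8, 9]
// quickSort and mergeSort return new arrays and leave the input untouched
console.log(quickSort(input))          // [1, 2, 3, 5, 8, 9]
console.log(mergeSort(input))          // [1, 2, 3, 5, 8, 9]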

General Complexity Concepts

  • Time complexity describes how the number of operations grows with the input size n, usually stated in Big O notation for the best, average, and worst cases.
  • Space complexity describes the auxiliary memory needed beyond the input itself; an algorithm is in-place if that extra memory is constant (or limited to a small recursion stack).
  • Stability (the Stable column above) means that elements with equal keys keep their original relative order, which matters when records are sorted by one field after previously being sorted by another.

Conclusion

The choice of sorting algorithm depends heavily on the specific requirements of the application. For small or nearly sorted inputs, the simple O(n²) algorithms (especially Insertion Sort) are often good enough and easy to implement correctly. For general-purpose in-memory sorting, Quick Sort is typically the fastest on average, while Merge Sort is preferred when guaranteed O(n log n) running time or stability is required, at the cost of O(n) extra space.

Understanding the mechanisms and trade-offs (time vs. space complexity, best/average/worst cases) of these fundamental algorithms is essential for developing efficient software.