A Hands-On Guide to Sorting Algorithms: Examples and Explanations

Sorting algorithms are essential tools in computer science, allowing us to organize data efficiently. In this article, we’ll explore several common sorting algorithms through examples, so you can see how they work in practice.

1. Bubble Sort

How It Works

Bubble Sort is a straightforward algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. This process continues until the list is sorted.

Time Complexity:

  • Best Case: \( O(n) \) (already sorted, when the loop stops early after a pass with no swaps)
  • Average Case: \( O(n^2) \)
  • Worst Case: \( O(n^2) \)

Example

Imagine you have the following array:

[5, 2, 9, 1, 5, 6]

Step-by-step process:

  1. Compare 5 and 2. Since 5 > 2, swap them.
    [2, 5, 9, 1, 5, 6]
    
  2. Compare 5 and 9. No swap needed.
  3. Compare 9 and 1. Swap.
    [2, 5, 1, 9, 5, 6]
    
  4. Continue until the end of the array. Repeat the process until no swaps are needed.

Code Example

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        # After each pass the largest remaining element has bubbled to the end,
        # so the inner loop can stop one position earlier each time.
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # If a full pass made no swaps, the list is already sorted (best case O(n)).
        if not swapped:
            break
    return arr

print(bubble_sort([5, 2, 9, 1, 5, 6]))  # Output: [1, 2, 5, 5, 6, 9]

2. Selection Sort

How It Works

Selection Sort divides the input list into a sorted and an unsorted section. It repeatedly selects the smallest element from the unsorted section and moves it to the end of the sorted section.

Time Complexity:

  • Best Case: \( O(n^2) \)
  • Average Case: \( O(n^2) \)
  • Worst Case: \( O(n^2) \)

Example

Consider the array:

[29, 10, 14, 37, 13]

Step-by-step process:

  1. Find the smallest element (10) and swap it with the first element.
    [10, 29, 14, 37, 13]
    
  2. Find the next smallest element (13) and swap it with the second element.
    [10, 13, 14, 37, 29]
    
  3. Repeat until the whole array is sorted.

Code Example

def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        # Find the index of the smallest element in the unsorted section arr[i:].
        min_index = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        # Swap it into position i, growing the sorted section by one element.
        arr[i], arr[min_index] = arr[min_index], arr[i]
    return arr

print(selection_sort([29, 10, 14, 37, 13]))  # Output: [10, 13, 14, 29, 37]

3. Insertion Sort

How It Works

Insertion Sort builds a sorted array one element at a time. It takes an element from the unsorted section and places it in the correct position within the sorted section.

Time Complexity:

  • Best Case: \( O(n) \) (already sorted)
  • Average Case: \( O(n^2) \)
  • Worst Case: \( O(n^2) \)

Example

Given the array:

[5, 3, 4, 1, 2]

Step-by-step process:

  1. Start with the first element (5) as sorted.
  2. Take the next element (3) and insert it into the sorted section:
    [3, 5, 4, 1, 2]
    
  3. Take 4 and insert it:
    [3, 4, 5, 1, 2]
    
  4. Continue until the array is fully sorted.

Code Example

def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift elements of the sorted section that are larger than key one slot right...
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        # ...then place key into the gap.
        arr[j + 1] = key
    return arr

print(insertion_sort([5, 3, 4, 1, 2]))  # Output: [1, 2, 3, 4, 5]

4. Merge Sort

How It Works

Merge Sort is a divide-and-conquer algorithm that splits the array into halves, recursively sorts them, and then merges the sorted halves back together.

Time Complexity:

  • Best Case: \( O(n \log n) \)
  • Average Case: \( O(n \log n) \)
  • Worst Case: \( O(n \log n) \)

Example

For the array:

[38, 27, 43, 3, 9, 82, 10]

Step-by-step process:

  1. Split the array into two halves:
    [38, 27, 43] and [3, 9, 82, 10]
    
  2. Keep splitting until you have single-element arrays.
  3. Start merging:
    [27, 38, 43] and [3, 9, 10, 82]
    
  4. Combine the sorted halves.

Code Example

def merge_sort(arr):
    if len(arr) > 1:
        # Split the array in half and sort each half recursively.
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]

        merge_sort(left_half)
        merge_sort(right_half)

        # Merge the two sorted halves back into arr.
        i = j = k = 0
        while i < len(left_half) and j < len(right_half):
            # Using <= keeps the sort stable: on ties, the left element comes first.
            if left_half[i] <= right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1

        # Copy whatever remains of the half that was not exhausted.
        while i < len(left_half):
            arr[k] = left_half[i]
            i += 1
            k += 1

        while j < len(right_half):
            arr[k] = right_half[j]
            j += 1
            k += 1

    return arr

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # Output: [3, 9, 10, 27, 38, 43, 82]

5. Quick Sort

How It Works

Quick Sort is another divide-and-conquer algorithm that selects a ‘pivot’ element and partitions the other elements into two sub-arrays according to whether they are less than or greater than the pivot.

Time Complexity:

  • Best Case: \( O(n \log n) \)
  • Average Case: \( O(n \log n) \)
  • Worst Case: \( O(n^2) \) (occurs when the smallest or largest element is always chosen as the pivot)

Example

Take the array:

[10, 80, 30, 90, 40, 50, 70]

Step-by-step process:

  1. Choose a pivot (e.g., 50).
  2. Rearrange the array:
    [10, 30, 40, 50, 90, 80, 70]
    
  3. Recursively apply the same logic to the sub-arrays.

Code Example

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    # Use the middle element as the pivot and partition into three lists.
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    # Sort the two partitions recursively and stitch everything back together.
    return quick_sort(left) + middle + quick_sort(right)

print(quick_sort([10, 80, 30, 90, 40, 50, 70]))  # Output: [10, 30, 40, 50, 70, 80, 90]
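
The list-comprehension version above is short and readable, but it allocates new lists on every call and can still hit the \( O(n^2) \) worst case if the pivot choice happens to be consistently bad. A common safeguard is to choose the pivot at random. Below is a minimal in-place sketch using a randomized Lomuto-style partition; the names quick_sort_inplace and partition are purely illustrative and are not part of the code above.

import random

def quick_sort_inplace(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)         # pivot ends up at index p
        quick_sort_inplace(arr, low, p - 1)   # sort the left part
        quick_sort_inplace(arr, p + 1, high)  # sort the right part
    return arr

def partition(arr, low, high):
    # Pick a random pivot and move it to the end, then run a Lomuto partition.
    pivot_index = random.randint(low, high)
    arr[pivot_index], arr[high] = arr[high], arr[pivot_index]
    pivot = arr[high]
    i = low
    for j in range(low, high):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[high] = arr[high], arr[i]  # place the pivot between the two parts
    return i

print(quick_sort_inplace([10, 80, 30, 90, 40, 50, 70]))  # Output: [10, 30, 40, 50, 70, 80, 90]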

6. Heap Sort

How It Works

Heap Sort uses a binary heap data structure to sort elements. It first builds a max heap from the input data and then repeatedly extracts the maximum element from the heap.

Time Complexity:

  • Best Case: \( O(n \log n) \)
  • Average Case: \( O(n \log n) \)
  • Worst Case: \( O(n \log n) \)

Example

Given the array:

[3, 5, 1, 10, 2]

Step-by-step process:

  1. Build a max heap, for example:
    [10, 5, 3, 1, 2]
    (several layouts are valid; the heapify code below happens to produce [10, 5, 1, 3, 2])
    
  2. Swap the largest element (10) to the end of the array and sift the new root down to restore the heap.
  3. Continue until the array is sorted.

Code Example

def heapify(arr, n, i):
    # Sift the value at index i down until the subtree rooted at i is a max heap.
    largest = i
    left = 2 * i + 1
    right = 2 * i + 2

    if left < n and arr[i] < arr[left]:
        largest = left
    if right < n and arr[largest] < arr[right]:
        largest = right

    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    # Build a max heap by sifting down every non-leaf node, bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Repeatedly move the root (current maximum) to the end and shrink the heap.
    for i in range(n - 1, 0, -1):
        arr[i], arr[0] = arr[0], arr[i]
        heapify(arr, i, 0)
    return arr

print(heap_sort([3, 5, 1, 10, 2]))  # Output: [1, 2, 3, 5, 10]
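
As a side note, Python's standard library ships a binary heap in the heapq module (a min-heap rather than a max heap). The short sketch below shows one way heap sort could be expressed on top of it; it is an alternative to the hand-rolled heapify above, not what the code above does.

import heapq

def heap_sort_with_heapq(arr):
    # heapq maintains a min-heap, so popping repeatedly yields ascending order.
    heap = list(arr)     # copy so the input list is left untouched
    heapq.heapify(heap)  # O(n) heap construction
    return [heapq.heappop(heap) for _ in range(len(arr))]

print(heap_sort_with_heapq([3, 5, 1, 10, 2]))  # Output: [1, 2, 3, 5, 10]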

When to Use Which Sorting Algorithm

  • Bubble Sort, Selection Sort, and Insertion Sort: Simple and fine for small datasets (Insertion Sort in particular does well on nearly sorted data), but their \( O(n^2) \) running time makes them impractical for large inputs.
  • Merge Sort: Preferred for larger datasets and when stability is needed, i.e. preserving the relative order of equal elements (see the sketch after this list).
  • Quick Sort: Often the fastest in practice for average cases, but the pivot must be chosen with care (e.g. at random) to avoid the \( O(n^2) \) worst case.
  • Heap Sort: Useful when memory usage is a concern (it sorts in place) and when you need a guaranteed \( O(n \log n) \) worst case.
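
To make the stability point concrete, here is a small illustrative snippet using Python's built-in sorted() (Timsort, which is stable): records that compare equal on the sort key keep their original relative order. The student data is made up purely for the example.

# Sort (name, score) pairs by score only; ties keep their original order.
students = [("Ana", 90), ("Bo", 85), ("Cy", 90), ("Di", 85)]
print(sorted(students, key=lambda s: s[1]))
# Output: [('Bo', 85), ('Di', 85), ('Ana', 90), ('Cy', 90)]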

Conclusion

Sorting algorithms are foundational to understanding data manipulation in computer science. From simple methods like Bubble Sort to more advanced techniques like Quick Sort and Merge Sort, each algorithm has its strengths and use cases. By experimenting with these examples, you can gain a deeper appreciation for how sorting works and when to apply each method effectively. Happy coding!
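
If you want to take that experimenting further, one handy sanity check is to compare each implementation against Python's built-in sorted() on random inputs. A minimal sketch, assuming the functions above are already defined in the same file:

import random

def check(sort_fn, trials=100):
    # Compare sort_fn against the built-in sorted() on random lists.
    for _ in range(trials):
        data = [random.randint(0, 99) for _ in range(random.randint(0, 20))]
        assert sort_fn(list(data)) == sorted(data), f"{sort_fn.__name__} failed on {data}"
    print(f"{sort_fn.__name__}: OK")

for fn in (bubble_sort, selection_sort, insertion_sort, merge_sort, quick_sort, heap_sort):
    check(fn)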
