Demystifying Sorting Algorithms: A Deep Dive into Selection Sort
So you've dipped your toes into arrays and linked lists (here's a link to the tutorial for a quick refresher), and you've gotten comfortable with Big O notation (check out the article if you haven't yet). It's time to unpack the mechanics of the selection sort algorithm.
Let's imagine you have a list of artists on your computer, ranked by play count:
Artist | Play count |
---|---|
Van Halen | 132 |
The Rolling Stones | 353 |
Metallica | 220 |
Nirvana | 185 |
Aerosmith | 157 |
Led Zeppelin | 238 |
Red Hot Chili Peppers | 201 |
You'd like to sort this list from highest to lowest play count to rank your favorite artists. One way to do it is to pull the most played artist out of the list and add it to a new one, then repeat this process until every artist is accounted for:
Artist | Play count |
---|---|
The Rolling Stones | 353 |
Led Zeppelin | 238 |
Metallica | 220 |
Red Hot Chili Peppers | 201 |
Nirvana | 185 |
Aerosmith | 157 |
Van Halen | 132 |
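The "pull the most played artist, repeat" idea above can be sketched in JavaScript. This is my own illustration, assuming the artists are stored as `{ name, plays }` objects (the object shape and the `sortByPlays` name are not from the original post):

```javascript
const artists = [
  { name: 'Van Halen', plays: 132 },
  { name: 'The Rolling Stones', plays: 353 },
  { name: 'Metallica', plays: 220 },
  { name: 'Nirvana', plays: 185 },
  { name: 'Aerosmith', plays: 157 },
  { name: 'Led Zeppelin', plays: 238 },
  { name: 'Red Hot Chili Peppers', plays: 201 },
]

function sortByPlays(list) {
  const remaining = [...list] // copy, so the caller's array isn't mutated
  const sorted = []
  while (remaining.length) {
    // find the index of the most played artist still in the list
    let mostPlayedIndex = 0
    for (let i = 1; i < remaining.length; i++) {
      if (remaining[i].plays > remaining[mostPlayedIndex].plays) {
        mostPlayedIndex = i
      }
    }
    // pull it out of the old list and append it to the new one
    sorted.push(remaining.splice(mostPlayedIndex, 1)[0])
  }
  return sorted
}

console.log(sortByPlays(artists).map((a) => a.name))
// ['The Rolling Stones', 'Led Zeppelin', 'Metallica',
//  'Red Hot Chili Peppers', 'Nirvana', 'Aerosmith', 'Van Halen']
```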
Evaluating the Algorithm
To find the artist with the most plays, we check each item in the list, an operation that takes O(n) time. To find the second most popular artist, you again need to scan the list item by item, another O(n) pass. In other words, to find each next most popular artist, you run the same O(n) scan, and you do so n times. This makes the total running time O(n²).
What About Fewer Elements Each Time?
You'd be right to observe that the number of elements to examine shrinks with every pass, eventually leaving only a single element. So why the O(n²) complexity? The answer lies in how Big O notation works.
You first check n elements, then n - 1, then n - 2, and so forth, which averages out to checking 1/2 * n elements per pass. The running time is therefore O(n * 1/2 * n). However, in Big O notation we disregard constants like 1/2, leaving us with O(n * n), or O(n²).
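The arithmetic above can be sanity-checked with a few lines of code. This snippet is my own illustration (the `totalChecks` name is not from the original post): each pass scans the remaining items, so the total work is n + (n - 1) + … + 1, which equals n * (n + 1) / 2 and grows like n² once the constants are dropped:

```javascript
// Counts how many element checks selection sort performs on a list of n items:
// the first pass scans n items, the next n - 1, and so on down to 1.
function totalChecks(n) {
  let checks = 0
  for (let remaining = n; remaining >= 1; remaining--) {
    checks += remaining // one full scan of the remaining items
  }
  return checks
}

console.log(totalChecks(7)) // 28 — checks for the 7-artist list
console.log((7 * 8) / 2) // 28 — the closed form n * (n + 1) / 2 agrees
```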
Implementing the Algorithm
Selection sort is a solid algorithm, but it isn't particularly fast. Quicksort, for example, is faster, with a complexity of O(n log n), but that's a topic for another post.
Here's how you'd sort an array of numbers with the selection sort algorithm in JavaScript:
```javascript
// Returns the index of the smallest element in the array
function searchForSmallest(arr) {
  let smallest = arr[0]
  let smallestIndex = 0
  for (let i = 1; i < arr.length; i++) {
    if (arr[i] < smallest) {
      smallest = arr[i]
      smallestIndex = i
    }
  }
  return smallestIndex
}

// Repeatedly pulls the smallest remaining element into a new array
function selectionSort(arr) {
  const newArr = []
  while (arr.length) {
    const smallestIndex = searchForSmallest(arr)
    newArr.push(arr.splice(smallestIndex, 1)[0])
  }
  return newArr
}

console.log(selectionSort([5, 3, 6, 2, 10])) // [2, 3, 5, 6, 10]
```
You can find this script in my GitHub algorithms repository here. For a more visual and interactive walkthrough of the algorithm, check out this Nodepad note I've prepared for you.
Conclusion
And there you have it, folks! The selection sort algorithm might not be the fastest, but it's straightforward and a great starting point for understanding how sorting algorithms work. Its strength lies in its simplicity, and the concept translates easily to real-world tasks.
Remember, in programming (and in life), understanding the basics is key to tackling even the most complex problems. The foundation you've built with selection sort will carry you forward on your journey toward more efficient algorithms.
Keep your eyes on our future discussions about sorting algorithms like Quicksort, and let's continue expanding our arsenal together!
Don't hesitate to revisit this piece anytime you need to refresh your Selection Sort knowledge. As always, happy coding!