Sorting Algorithms Continued: What Is Recursion?

Jacob Knopf
3 min read · Jul 25, 2020

Last week I showed how to implement Bubble Sort, Insertion Sort, and Selection Sort with code examples, and in this post I'm going to cover two new algorithms: Quicksort and Mergesort. Both rely on the "divide and conquer" principle, which I'll explain in more detail later. Before we can talk about these sorting methods, though, you need to know about recursion. So what is recursion, you ask?

Recursion — Essentially, this is a technique where a function repeatedly calls itself, directly or indirectly, until a specific condition is met: the stopping point (base case) defined in the function. With each call a smaller portion of the original data set is worked on. If the stopping point isn't defined or reached correctly, you end up in infinite recursion. Recursive algorithms can solve certain problems very cleanly, and they generally handle large data sets better than the three algorithms I discussed last week.
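To make that concrete before we get to Quicksort and Mergesort, here is a small Ruby sketch of my own (an illustrative example, separate from the sorting code below): a recursive factorial method whose stopping point is n <= 1.

def factorial(n)
  return 1 if n <= 1      # base case: the stopping point that ends the recursion
  n * factorial(n - 1)    # recursive case: call the method again on a smaller value
end

factorial(5)
=> 120

Each call shrinks the problem by one until the base case returns, and the partial results multiply back up the chain of calls. Now let's get into Quicksort and Mergesort.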

Quicksort — This algorithm, also known as "partition exchange sort," works by picking a pivot element in the collection and then partitioning the remaining items around that key based on whether their values are less than or greater than the pivot. After the first pass, when everything on the left-hand side of the pivot is smaller and everything on the right-hand side is greater, the array is broken into two subarrays and Quicksort is applied to each of them. New pivots are picked, the smaller sublists are partitioned around those new keys, and the process repeats until the original collection is in order. Below is a Ruby implementation of Quicksort along with the expected output. There are different strategies for picking the pivot value, but in my example I pick it at random.

a = [1, 5, 4, 7, 3, 2, -1, -3, -9, -7]

def quick_sort(a)
  return a if a.length <= 1              # base case: nothing left to sort

  pivot = a.delete_at(rand(a.length))    # remove a random element to use as the pivot
  left = []
  right = []

  a.each do |i|
    if i <= pivot                        # values less than or equal to the pivot go left
      left << i
    else                                 # values greater than the pivot go right
      right << i
    end
  end

  return *quick_sort(left), pivot, *quick_sort(right)   # sort each side and splice the pivot between them
end

quick_sort(a)
=> [-9, -7, -3, -1, 1, 2, 3, 4, 5, 7]
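One caveat worth flagging (my observation about the snippet above, not something the algorithm requires): delete_at modifies the array it is called on, so the original a loses its first pivot after the sort. If you need the input left untouched, you can hand the method a copy instead:

sorted = quick_sort(a.dup)   # sort a duplicate so the original array stays intact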

Mergesort — The idea behind this algorithm is to split the array into two roughly equal halves and keep splitting the pieces until they cannot be divided any further. These single-element pieces are known as "atomic values." Once this is achieved, the subarrays are merged back together, this time in sorted order. Like Quicksort, this method is recursive, calling itself again and again until the specified condition is met and the solution is found. Here is a Ruby implementation of Mergesort along with the expected output.

a = [5, 8, 2, 4, -3, -8, -9, 2, 4, 7]

def merge_sort(a)
  if a.length <= 1                        # base case: a single element is already sorted
    a
  else
    mid = (a.length / 2).floor            # split point for the two halves
    left = merge_sort(a[0..mid - 1])      # recursively sort the left half
    right = merge_sort(a[mid..a.length])  # recursively sort the right half
    merge(left, right)                    # merge the two sorted halves
  end
end

def merge(left, right)
  if left.empty?                          # one side is empty: return the other
    right
  elsif right.empty?
    left
  elsif left[0] < right[0]                # take the smaller head element, then merge the rest
    [left[0]] + merge(left[1..left.length], right)
  else
    [right[0]] + merge(left, right[1..right.length])
  end
end

merge_sort(a)
=> [-9, -8, -3, 2, 2, 4, 4, 5, 7, 8]
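If the merge step feels abstract, here is a trace of my own (hypothetical inputs, just for illustration) showing how merge interleaves two already sorted halves:

merge([2, 5], [3, 4])
# => [2] + merge([5], [3, 4])
# => [2, 3] + merge([5], [4])
# => [2, 3, 4] + merge([5], [])
# => [2, 3, 4, 5]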

Understanding recursion was pretty difficult for me in the beginning, and it was easy to get lost in all of the subcollections. However, if you want to pass that next technical interview, recursion must become your new best friend. Both Quicksort and Mergesort are well suited to sorting large data sets, with an average time complexity of O(n log n), and any company you want to work for in 2020 is dealing with an inordinate amount of data at this point. Thanks for reading, and good luck to everybody out there job hunting!


Jacob Knopf

QA Lead | Content Designer | UX Researcher | Developer Evangelist