\(\) \(%\newcommand{\CLRS}{\href{https://mitpress.mit.edu/books/introduction-algorithms}{Cormen et al}}\) \(\newcommand{\CLRS}{\href{https://mitpress.mit.edu/books/introduction-algorithms}{Introduction to Algorithms, 3rd ed.} (\href{http://libproxy.aalto.fi/login?url=http://site.ebrary.com/lib/aalto/Doc?id=10397652}{Aalto access})}\) \(%\newcommand{\CLRS}{\href{https://mitpress.mit.edu/books/introduction-algorithms}{Introduction to Algorithms, 3rd ed.} (\href{http://libproxy.aalto.fi/login?url=http://site.ebrary.com/lib/aalto/Doc?id=10397652}{online via Aalto lib})}\) \(\newcommand{\SW}{\href{http://algs4.cs.princeton.edu/home/}{Algorithms, 4th ed.}}\) \(%\newcommand{\SW}{\href{http://algs4.cs.princeton.edu/home/}{Sedgewick and Wayne}}\) \(\) \(\newcommand{\Java}{\href{http://java.com/en/}{Java}}\) \(\newcommand{\Python}{\href{https://www.python.org/}{Python}}\) \(\newcommand{\CPP}{\href{http://www.cplusplus.com/}{C++}}\) \(\newcommand{\ST}[1]{{\Blue{\textsf{#1}}}}\) \(\newcommand{\PseudoCode}[1]{{\color{blue}\textsf{#1}}}\) \(\) \(%\newcommand{\subheading}[1]{\textbf{\large\color{aaltodgreen}#1}}\) \(\newcommand{\subheading}[1]{\large{\usebeamercolor[fg]{frametitle} #1}}\) \(\) \(\newcommand{\Blue}[1]{{\color{flagblue}#1}}\) \(\newcommand{\Red}[1]{{\color{aaltored}#1}}\) \(\newcommand{\Emph}[1]{\emph{\color{flagblue}#1}}\) \(\) \(\newcommand{\Engl}[1]{({\em engl.}\ #1)}\) \(\) \(\newcommand{\Pointer}{\raisebox{-1ex}{\huge\ding{43}}\ }\) \(\) \(\newcommand{\Set}[1]{\{#1\}}\) \(\newcommand{\Setdef}[2]{\{{#1}\mid{#2}\}}\) \(\newcommand{\PSet}[1]{\mathcal{P}(#1)}\) \(\newcommand{\Card}[1]{{\vert{#1}\vert}}\) \(\newcommand{\Tuple}[1]{(#1)}\) \(\newcommand{\Implies}{\Rightarrow}\) \(\newcommand{\Reals}{\mathbb{R}}\) \(\newcommand{\Seq}[1]{(#1)}\) \(\newcommand{\Arr}[1]{[#1]}\) \(\newcommand{\Floor}[1]{{\lfloor{#1}\rfloor}}\) \(\newcommand{\Ceil}[1]{{\lceil{#1}\rceil}}\) \(\newcommand{\Path}[1]{(#1)}\) \(\) \(%\newcommand{\Lg}{\lg}\) \(\newcommand{\Lg}{\log_2}\) \(\) 
\(\newcommand{\BigOh}{O}\) \(%\newcommand{\BigOh}{\mathcal{O}}\) \(\newcommand{\Oh}[1]{\BigOh(#1)}\) \(\) \(\newcommand{\todo}[1]{\Red{\textbf{TO DO: #1}}}\) \(\) \(\newcommand{\NULL}{\textsf{null}}\) \(\) \(\newcommand{\Insert}{\ensuremath{\textsc{insert}}}\) \(\newcommand{\Search}{\ensuremath{\textsc{search}}}\) \(\newcommand{\Delete}{\ensuremath{\textsc{delete}}}\) \(\newcommand{\Remove}{\ensuremath{\textsc{remove}}}\) \(\) \(\newcommand{\Parent}[1]{\mathop{parent}(#1)}\) \(\) \(\newcommand{\ALengthOf}[1]{{#1}.\textit{length}}\) \(\) \(\newcommand{\TRootOf}[1]{{#1}.\textit{root}}\) \(\newcommand{\TLChildOf}[1]{{#1}.\textit{leftChild}}\) \(\newcommand{\TRChildOf}[1]{{#1}.\textit{rightChild}}\) \(\newcommand{\TNode}{x}\) \(\newcommand{\TNodeI}{y}\) \(\newcommand{\TKeyOf}[1]{{#1}.\textit{key}}\) \(\) \(\newcommand{\PEnqueue}[2]{{#1}.\textsf{enqueue}(#2)}\) \(\newcommand{\PDequeue}[1]{{#1}.\textsf{dequeue}()}\) \(\) \(\newcommand{\Def}{\mathrel{:=}}\) \(\) \(\newcommand{\Eq}{\mathrel{=}}\) \(\newcommand{\Asgn}{\mathrel{\leftarrow}}\) \(%\newcommand{\Asgn}{\mathrel{:=}}\) \(\) \(%\) \(% Heaps\) \(%\) \(\newcommand{\Downheap}{\textsc{downheap}}\) \(\newcommand{\Upheap}{\textsc{upheap}}\) \(\newcommand{\Makeheap}{\textsc{makeheap}}\) \(\) \(%\) \(% Dynamic sets\) \(%\) \(\newcommand{\SInsert}[1]{\textsc{insert}(#1)}\) \(\newcommand{\SSearch}[1]{\textsc{search}(#1)}\) \(\newcommand{\SDelete}[1]{\textsc{delete}(#1)}\) \(\newcommand{\SMin}{\textsc{min}()}\) \(\newcommand{\SMax}{\textsc{max}()}\) \(\newcommand{\SPredecessor}[1]{\textsc{predecessor}(#1)}\) \(\newcommand{\SSuccessor}[1]{\textsc{successor}(#1)}\) \(\) \(%\) \(% Union-find\) \(%\) \(\newcommand{\UFMS}[1]{\textsc{make-set}(#1)}\) \(\newcommand{\UFFS}[1]{\textsc{find-set}(#1)}\) \(\newcommand{\UFCompress}[1]{\textsc{find-and-compress}(#1)}\) \(\newcommand{\UFUnion}[2]{\textsc{union}(#1,#2)}\) \(\) \(\) \(%\) \(% Graphs\) \(%\) \(\newcommand{\Verts}{V}\) \(\newcommand{\Vtx}{v}\) \(\newcommand{\VtxA}{v_1}\) 
\(\newcommand{\VtxB}{v_2}\) \(\newcommand{\VertsA}{V_\textup{A}}\) \(\newcommand{\VertsB}{V_\textup{B}}\) \(\newcommand{\Edges}{E}\) \(\newcommand{\Edge}{e}\) \(\newcommand{\NofV}{\Card{V}}\) \(\newcommand{\NofE}{\Card{E}}\) \(\newcommand{\Graph}{G}\) \(\) \(\newcommand{\SCC}{C}\) \(\newcommand{\GraphSCC}{G^\text{SCC}}\) \(\newcommand{\VertsSCC}{V^\text{SCC}}\) \(\newcommand{\EdgesSCC}{E^\text{SCC}}\) \(\) \(\newcommand{\GraphT}{G^\text{T}}\) \(%\newcommand{\VertsT}{V^\textup{T}}\) \(\newcommand{\EdgesT}{E^\text{T}}\) \(\) \(%\) \(% NP-completeness etc\) \(%\) \(\newcommand{\Poly}{\textbf{P}}\) \(\newcommand{\NP}{\textbf{NP}}\) \(\newcommand{\PSPACE}{\textbf{PSPACE}}\) \(\newcommand{\EXPTIME}{\textbf{EXPTIME}}\)
\(\newcommand{\LO}{{\textit{lo}}}\) \(\newcommand{\HI}{{\textit{hi}}}\)

Basic definitions

In this round, we study algorithms for sorting sequences, usually arrays, of elements. We assume that a sorting algorithm receives an input array to be sorted and then rearranges the elements in the same array so that they are in ascending order. That is, the original input array gets modified. For instance, the java.util.Arrays.sort methods and the C++ standard library sort function work in this way. In contrast, the Scala sorted method on sequences does not modify the original sequence but returns a new one containing the elements in order. Under the hood, however, it allocates an auxiliary array and applies java.util.Arrays.sort to that copy (see the source code).
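As a small sketch of this difference in Scala (using scala.util.Sorting.quickSort as an example of a sort that modifies the array in place):

```scala
val a = Array(7, 2, 11, 5, 7)

// `sorted` leaves the original array untouched and returns a new one.
val b = a.sorted
// a is still Array(7, 2, 11, 5, 7); b is Array(2, 5, 7, 7, 11)

// `scala.util.Sorting.quickSort` rearranges the array in place instead.
scala.util.Sorting.quickSort(a)
// a is now Array(2, 5, 7, 7, 11)
```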

For simplicity of illustration, we usually use arrays of integers in the examples. In this case, the elements are sorted according to the usual ascending numeric order \(\le\). For example, the array \(\Arr{7,2,11,5,7}\) of five integers is sorted into \(\Arr{2,5,7,7,11}\). Observe that an array can contain multiple equal elements, as just illustrated.

However, one should keep in mind that most of the algorithms extend in a straightforward way to arrays of other primitive types (integers, floating point numbers, etc.) as well as non-primitive types (strings, pairs, unbounded integers such as Scala BigInt, generic objects). If the elements are of a non-primitive data type, then in the current version of Java, and thus also in Scala, the sequence contains references to the actual objects. This makes swapping the positions of two objects in the array cheap, as one only has to move the references. For instance, swapping the positions of the strings “CS-A1110” and “CS-A1140” in the array

_images/sorting-strings-1.png

results in the array

_images/sorting-strings-2.png

In some other languages, such as C and C++, non-primitive objects can also be stored directly inside the sequence. In such a case, swapping the positions of two objects may be more costly, as the objects themselves have to be moved.
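In Scala, such a reference swap can be sketched as follows (the swap function here is a helper written for illustration, not a library method):

```scala
// A minimal sketch: swapping two elements of an array.
// For reference types, only the two references move; the string
// objects themselves stay where they are on the heap.
def swap[T](a: Array[T], i: Int, j: Int): Unit = {
  val tmp = a(i)
  a(i) = a(j)
  a(j) = tmp
}

val courses = Array("CS-A1110", "CS-A1140")
swap(courses, 0, 1)
// courses is now Array("CS-A1140", "CS-A1110")
```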

For all kinds of elements, most sorting algorithms also allow the ordering between the elements to be given as input. For instance, in Scala we can sort a sequence of floating point numbers into descending order as follows:

scala> val a = Array(1.0,-3.5,5.0,-11.2,24.0)
a: Array[Double] = Array(1.0, -3.5, 5.0, -11.2, 24.0)

scala> a.sortWith(_ > _)
res0: Array[Double] = Array(24.0, 5.0, 1.0, -3.5, -11.2)

Above, the anonymous function _ > _ serves as the “less-than” ordering argument of the sortWith method. As a result, the sorting algorithm sorts the elements into ascending order as usual, but with respect to an ordering under which “\(x < y\)” holds exactly when in fact \(x > y\); the output is thus in descending numeric order.
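The same descending order can also be obtained by passing a reversed Ordering to the sorted method; a small sketch:

```scala
val a = Array(1.0, -3.5, 5.0, -11.2, 24.0)

// Equivalent to a.sortWith(_ > _): sort with the reverse of the
// default ascending ordering on Double.
val descending = a.sorted(Ordering[Double].reverse)
// descending is Array(24.0, 5.0, 1.0, -3.5, -11.2)
```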

Structured objects can also be sorted with respect to some specific field or substructure called the key. For instance, below the employees are sorted by their age. Note that, again, two distinct objects can be considered equal with respect to the key.

_images/sorting-keys.png
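As a sketch of the same idea in code (the Employee class and the data below are made up for illustration only):

```scala
// Hypothetical example type; only the `age` field acts as the sort key.
case class Employee(name: String, age: Int)

val staff = List(Employee("Dilbert", 42), Employee("Alice", 35),
                 Employee("Wally", 52), Employee("Asok", 22))

// sortBy extracts the key from each element and sorts by it.
val byAge = staff.sortBy(_.age)
// byAge contains Asok (22), Alice (35), Dilbert (42), Wally (52)
```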

In the running time analyses of sorting algorithms, we assume that random access, meaning reading or writing the element at any position of the sequence, can be done in constant time. Furthermore, for simplicity, we assume that comparing two elements with respect to the applied element ordering can be done in constant time as well. For non-primitive data types, such as long strings or complex structured data fields, this may not be the case. In such cases, one can also consider other efficiency aspects, such as the number of comparisons performed by the algorithm.
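When comparisons are the expensive operation, their number can be measured by wrapping the element ordering in one that counts its invocations; a small sketch:

```scala
// A sketch: count comparisons by wrapping the element ordering.
var comparisons = 0
val countingOrd = new Ordering[Int] {
  def compare(x: Int, y: Int): Int = {
    comparisons += 1
    Integer.compare(x, y)
  }
}

val result = Array(5, 3, 8, 1, 9, 2).sorted(countingOrd)
// `comparisons` now holds the number of comparisons the sorting
// algorithm made; the exact count depends on the algorithm used.
```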

Stability

Definition: Stable sorting algorithm

A sorting algorithm is stable if it keeps the relative order of equal elements unchanged.

In Scala, the sorted and sortWith methods of sequences implementing SeqLike, such as List and Array, are stable. In addition, sortBy is implemented on top of sorted and is therefore also stable.

As an example, we can sort a list

val l = List(("c",2),("b",3),("e",2),("a",3),("b",2),("a",2))

to lexicographical order (first field more significant) by first sorting by the second field

scala> val tmp = l.sortBy(_._2)
tmp: List[(String, Int)] = List((c,2), (e,2), (b,2), (a,2), (b,3), (a,3))

and then by the first field with a stable sorting algorithm

scala> val result = tmp.sortBy(_._1)
result: List[(String, Int)] = List((a,2), (a,3), (b,2), (b,3), (c,2), (e,2))

In the last step, a stable sorting algorithm keeps the relative order of (a,2) and (a,3) unchanged even though their first fields are the same.
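The two stable passes above produce the same order as sorting directly with the lexicographic ordering on pairs; a quick sketch to check:

```scala
val l = List(("c", 2), ("b", 3), ("e", 2), ("a", 3), ("b", 2), ("a", 2))

// Two stable passes, least significant field first...
val twoPass = l.sortBy(_._2).sortBy(_._1)

// ...give the same order as sorting directly with the built-in
// lexicographic ordering on pairs.
val direct = l.sorted
// both are List((a,2), (a,3), (b,2), (b,3), (c,2), (e,2))
```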

In Java, the java.util.Arrays.sort methods for arrays of objects are stable.

The C++ algorithm library includes separate functions implementing

  • stable sorting (the stable_sort function), and

  • sorting that is not guaranteed to be stable (the sort function).

The latter has slightly better asymptotic worst-case performance when there is not enough memory available for an auxiliary array: sort performs \(\Oh{n \log n}\) comparisons in the worst case, while stable_sort may then need \(\Oh{n \log^2 n}\) comparisons.

Extra memory requirements and in-place sorting

Different sorting algorithms also vary in how much extra memory they need during their operation.

Definition

A sorting algorithm needs \(g(n)\) units of extra memory if, when sorting any sequence of length \(n\), it uses at most \(g(n)\) units of memory in addition to the memory of the input/output sequence itself.

As we are usually not interested in the exact amount of extra memory needed but rather in how the extra memory requirement grows with the input size, we use the \(\BigOh\) and \(\Theta\) notations here as well. For instance, some sorting algorithms use only a constant \(\Theta(1)\) amount of extra memory and may thus be very well suited for embedded systems with a limited amount of RAM. On the other hand, some sorting algorithms use \(\Theta(n)\) extra memory because they allocate an auxiliary array of \(n\) elements for computing intermediate results; this may be justified when both stability and quite good performance are required at the same time.
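For instance, the merge step of merge sort is a typical source of a \(\Theta(n)\) extra memory requirement; a sketch for integer arrays:

```scala
// A sketch of the merge step of merge sort: merging the sorted ranges
// a[lo..mid] and a[mid+1..hi] allocates a temporary array of their
// combined length, which is the source of the Theta(n) extra memory.
def merge(a: Array[Int], lo: Int, mid: Int, hi: Int): Unit = {
  val tmp = new Array[Int](hi - lo + 1) // the auxiliary memory
  var i = lo; var j = mid + 1; var k = 0
  while (i <= mid && j <= hi) {
    if (a(i) <= a(j)) { tmp(k) = a(i); i += 1 }
    else { tmp(k) = a(j); j += 1 }
    k += 1
  }
  while (i <= mid) { tmp(k) = a(i); i += 1; k += 1 }
  while (j <= hi) { tmp(k) = a(j); j += 1; k += 1 }
  Array.copy(tmp, 0, a, lo, tmp.length) // copy the merged range back
}

val a = Array(2, 5, 11, 3, 7) // a[0..2] and a[3..4] are sorted
merge(a, 0, 2, 4)
// a is now Array(2, 3, 5, 7, 11)
```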

In addition to the asymptotic extra memory requirement analysis, one regularly also encounters the concept of sorting “in place”. For instance, the following definition is used in the book Introduction to Algorithms, 3rd ed. (online via Aalto lib):

Definition: In-place sorting algorithm

A sorting algorithm works in place if only a constant number of the array elements are stored outside the array at any point in time.

The definition above is quite liberal: it allows, for instance, the use of a linear number of array indices in the recursive call stack of the algorithm. Stricter definitions exist; see, for instance, this Wikipedia page for discussion.
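As an example, insertion sort satisfies even the stricter notions of in-place sorting: at any moment it keeps only one element and a couple of indices outside the array. A sketch for integer arrays:

```scala
// Insertion sort: sorts the array in place using only a constant
// amount of extra memory (one element and two indices).
def insertionSort(a: Array[Int]): Unit = {
  var i = 1
  while (i < a.length) {
    val key = a(i)          // the only element stored outside the array
    var j = i - 1
    while (j >= 0 && a(j) > key) {
      a(j + 1) = a(j)       // shift larger elements one step right
      j -= 1
    }
    a(j + 1) = key
    i += 1
  }
}

val a = Array(7, 2, 11, 5, 7)
insertionSort(a)
// a is now Array(2, 5, 7, 7, 11)
```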