\(\newcommand{\Java}{\href{http://java.com/en/}{Java}}\) \(\newcommand{\Python}{\href{https://www.python.org/}{Python}}\) \(\newcommand{\CPP}{\href{http://www.cplusplus.com/}{C++}}\) \(\newcommand{\ST}[1]{{\Blue{\textsf{#1}}}}\) \(\newcommand{\PseudoCode}[1]{{\color{blue}\textsf{#1}}}\) \(%\newcommand{\subheading}[1]{\textbf{\large\color{aaltodgreen}#1}}\) \(\newcommand{\subheading}[1]{\large{\usebeamercolor[fg]{frametitle} #1}}\) \(\newcommand{\Blue}[1]{{\color{flagblue}#1}}\) \(\newcommand{\Red}[1]{{\color{aaltored}#1}}\) \(\newcommand{\Emph}[1]{\emph{\color{flagblue}#1}}\) \(\newcommand{\Engl}[1]{({\em engl.}\ #1)}\) \(\newcommand{\Pointer}{\raisebox{-1ex}{\huge\ding{43}}\ }\) \(\newcommand{\Set}[1]{\{#1\}}\) \(\newcommand{\Setdef}[2]{\{{#1}\mid{#2}\}}\) \(\newcommand{\PSet}[1]{\mathcal{P}(#1)}\) \(\newcommand{\Card}[1]{{\vert{#1}\vert}}\) \(\newcommand{\Tuple}[1]{(#1)}\) \(\newcommand{\Implies}{\Rightarrow}\) \(\newcommand{\Reals}{\mathbb{R}}\) \(\newcommand{\Seq}[1]{(#1)}\) \(\newcommand{\Arr}[1]{[#1]}\) \(\newcommand{\Floor}[1]{{\lfloor{#1}\rfloor}}\) \(\newcommand{\Ceil}[1]{{\lceil{#1}\rceil}}\) \(\newcommand{\Path}[1]{(#1)}\) \(%\newcommand{\Lg}{\lg}\) \(\newcommand{\Lg}{\log_2}\) \(\newcommand{\BigOh}{O}\) \(\newcommand{\Oh}[1]{\BigOh(#1)}\) \(\newcommand{\todo}[1]{\Red{\textbf{TO DO: #1}}}\) \(\newcommand{\NULL}{\textsf{null}}\) \(\newcommand{\Insert}{\ensuremath{\textsc{insert}}}\) \(\newcommand{\Search}{\ensuremath{\textsc{search}}}\) \(\newcommand{\Delete}{\ensuremath{\textsc{delete}}}\) \(\newcommand{\Remove}{\ensuremath{\textsc{remove}}}\) \(\newcommand{\Parent}[1]{\mathop{parent}(#1)}\) \(\newcommand{\ALengthOf}[1]{{#1}.\textit{length}}\) \(\newcommand{\TRootOf}[1]{{#1}.\textit{root}}\) \(\newcommand{\TLChildOf}[1]{{#1}.\textit{leftChild}}\) \(\newcommand{\TRChildOf}[1]{{#1}.\textit{rightChild}}\) \(\newcommand{\TNode}{x}\) \(\newcommand{\TNodeI}{y}\) \(\newcommand{\TKeyOf}[1]{{#1}.\textit{key}}\) \(\newcommand{\PEnqueue}[2]{{#1}.\textsf{enqueue}(#2)}\) \(\newcommand{\PDequeue}[1]{{#1}.\textsf{dequeue}()}\) \(\newcommand{\Def}{\mathrel{:=}}\) \(\newcommand{\Eq}{\mathrel{=}}\) \(\newcommand{\Asgn}{\mathrel{\leftarrow}}\) \(%\newcommand{\Asgn}{\mathrel{:=}}\) \(%\) \(% Heaps\) \(%\) \(\newcommand{\Downheap}{\textsc{downheap}}\) \(\newcommand{\Upheap}{\textsc{upheap}}\) \(\newcommand{\Makeheap}{\textsc{makeheap}}\) \(%\) \(% Dynamic sets\) \(%\) \(\newcommand{\SInsert}[1]{\textsc{insert}(#1)}\) \(\newcommand{\SSearch}[1]{\textsc{search}(#1)}\) \(\newcommand{\SDelete}[1]{\textsc{delete}(#1)}\) \(\newcommand{\SMin}{\textsc{min}()}\) \(\newcommand{\SMax}{\textsc{max}()}\) \(\newcommand{\SPredecessor}[1]{\textsc{predecessor}(#1)}\) \(\newcommand{\SSuccessor}[1]{\textsc{successor}(#1)}\) \(%\) \(% Union-find\) \(%\) \(\newcommand{\UFMS}[1]{\textsc{make-set}(#1)}\) \(\newcommand{\UFFS}[1]{\textsc{find-set}(#1)}\) \(\newcommand{\UFCompress}[1]{\textsc{find-and-compress}(#1)}\) \(\newcommand{\UFUnion}[2]{\textsc{union}(#1,#2)}\) \(%\) \(% Graphs\) \(%\) \(\newcommand{\Verts}{V}\) \(\newcommand{\Vtx}{v}\) \(\newcommand{\VtxA}{v_1}\) \(\newcommand{\VtxB}{v_2}\) \(\newcommand{\VertsA}{V_\textup{A}}\) \(\newcommand{\VertsB}{V_\textup{B}}\) \(\newcommand{\Edges}{E}\) \(\newcommand{\Edge}{e}\) \(\newcommand{\NofV}{\Card{V}}\) \(\newcommand{\NofE}{\Card{E}}\) \(\newcommand{\Graph}{G}\) \(\newcommand{\SCC}{C}\) \(\newcommand{\GraphSCC}{G^\text{SCC}}\) \(\newcommand{\VertsSCC}{V^\text{SCC}}\) \(\newcommand{\EdgesSCC}{E^\text{SCC}}\) \(\newcommand{\GraphT}{G^\text{T}}\) 
\(%\newcommand{\VertsT}{V^\textup{T}}\) \(\newcommand{\EdgesT}{E^\text{T}}\) \(%\) \(% NP-completeness etc\) \(%\) \(\newcommand{\Poly}{\textbf{P}}\) \(\newcommand{\NP}{\textbf{NP}}\) \(\newcommand{\PSPACE}{\textbf{PSPACE}}\) \(\newcommand{\EXPTIME}{\textbf{EXPTIME}}\)
\(\newcommand{\PQInsertF}{\textsf{insert}}\) \(\newcommand{\PQInsert}[1]{\PQInsertF(#1)}\) \(\newcommand{\PQMaxF}{\textsf{max}}\) \(\newcommand{\PQMax}{\PQMaxF()}\) \(\newcommand{\PQRemoveF}{\textsf{remove-max}}\) \(\newcommand{\PQRemove}{\PQRemoveF()}\) \(\newcommand{\PQIncreaseF}{\textsf{increase-priority}}\) \(\newcommand{\PQIncrease}[2]{\PQIncreaseF(#1,#2)}\) \(\newcommand{\PQDecreaseF}{\textsf{decrease-priority}}\) \(\newcommand{\PQDecrease}[2]{\PQDecreaseF(#1,#2)}\) \(\renewcommand{\UFMS}[1]{\textsf{make-set}(#1)}\) \(\renewcommand{\UFFS}[1]{\textsf{find-set}(#1)}\) \(\renewcommand{\UFCompress}[1]{\textsf{find-and-compress}(#1)}\) \(\renewcommand{\UFUnion}[2]{\textsf{union}(#1,#2)}\) \(\newcommand{\UFParent}[1]{{#1}.\textsf{parent}}\) \(\newcommand{\UFNil}{\textsf{nil}}\) \(\newcommand{\UFRank}[1]{{#1}.\textsf{rank}}\) \(\newcommand{\PParam}{s}\)
\(\newcommand{\Graph}{G}\) \(\newcommand{\Verts}{V}\) \(\newcommand{\Vtx}{v}\) \(\newcommand{\Edges}{E}\) \(\newcommand{\Path}[1]{(#1)}\) \(\newcommand{\GColor}[1]{{#1}.\textrm{color}}\) \(\newcommand{\GWhite}{\textbf{white}}\) \(\newcommand{\GGray}{\textbf{gray}}\) \(\newcommand{\GBlack}{\textbf{black}}\) \(\newcommand{\GDist}[1]{{#1}.\text{dist}}\) \(\newcommand{\GPred}[1]{{#1}.\text{pred}}\) \(\newcommand{\GNil}{\textbf{nil}}\) \(\newcommand{\GTime}{\textit{time}}\) \(\newcommand{\GStart}[1]{{#1}.\text{start}}\) \(\newcommand{\GFinish}[1]{{#1}.\text{finish}}\) \(\newcommand{\GraphP}{G_\text{pred}}\) \(\newcommand{\EdgesP}{E_\text{pred}}\)
\(\newcommand{\SPDist}[2]{\delta(#1,#2)}\) \(\newcommand{\SPEst}[1]{{#1}.\textrm{dist}}\) \(\newcommand{\SPPred}[1]{{#1}.\textrm{pred}}\) \(\newcommand{\SPNil}{\textbf{nil}}\) \(\newcommand{\SPSrc}{s}\) \(\newcommand{\SPTgt}{t}\) \(\newcommand{\Weights}{w}\) \(\newcommand{\Weight}[2]{w(#1,#2)}\) \(\newcommand{\UWeights}{w}\) \(\newcommand{\UWeight}[2]{w(\Set{#1,#2})}\) \(\newcommand{\MSTCand}{A}\) \(%\newcommand{\UFMS}[1]{\textsc{make-set}(#1)}\) \(%\newcommand{\UFFS}[1]{\textsc{find-set}(#1)}\) \(%\newcommand{\UFUnion}[2]{\textsc{union}(#1,#2)}\) \(\newcommand{\MSTParent}[1]{{#1}.\textsf{parent}}\) \(\newcommand{\MSTNil}{\textbf{nil}}\) \(\newcommand{\MSTKey}[1]{{#1}.\textsf{key}}\) \(\newcommand{\MSTRoot}{\mathit{root}}\)

Minimum spanning trees

Consider a connected, undirected, and edge-weighted graph \(\Graph = \Tuple{\Verts,\Edges,\UWeights}\). A spanning tree of \(\Graph\) is a connected acyclic graph \(T = \Tuple{\Verts,\MSTCand,\UWeights}\) with \(\MSTCand \subseteq \Edges\). Its weight is the sum of its edge weights \(\sum_{\Set{u,v} \in \MSTCand}\UWeight{u}{v}\). A spanning tree of \(\Graph\) is a minimum spanning tree (MST) of \(\Graph\) if its weight is the smallest among all spanning trees of \(\Graph\). Observe that a graph can have many minimum spanning trees.
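
To make the definitions concrete, here is a small Python sketch (illustrative only; the names and the example weights are not from the course material). It represents an edge-weighted undirected graph as a dictionary mapping each edge \(\Set{u,v}\), stored as a frozenset, to its weight, and computes the weight of an edge set such as a spanning tree. The later sketches in this section reuse these conventions.

    def edge(u, v):
        """The undirected edge {u, v}."""
        return frozenset((u, v))

    # A hypothetical example graph (not the one shown in the figures below).
    example_weights = {
        edge("a", "b"): 5,
        edge("b", "c"): 3,
        edge("a", "c"): 4,
    }

    def total_weight(edge_set, weights):
        """Weight of an edge set (e.g. a spanning tree): the sum of its edge weights."""
        return sum(weights[e] for e in edge_set)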

Example

Consider the graph shown below.

_images/mst-ex1-graph.png

The highlighted edges in the two graphs below show two spanning trees for the graph above. The first one has weight 31 and is not minimum. The second one is a minimum spanning tree with weight 30.

_images/mst-ex1-st.png _images/mst-ex1-mst.png

As an application example, if the graph models an electronic circuit board so that

  • ​ the vertices are the components and

  • ​ the edges are the possible routes for the power/clock wires between the components,

then a minimum spanning tree of the graph gives a way to connect all the components with the power/clock wire by using the minimum amount of wire.

Note that we only consider connected graphs: for a graph with several connected components, we could compute a minimum spanning tree for each component individually. And we only consider edge-weighted undirected graphs; for directed graphs, the analogous concept is called a “spanning arborescence of minimum weight” or an “optimal branching”, see, for instance, the Wikipedia page on Edmonds’ algorithm. These are not included in the course.

A generic greedy approach

A graph may have an exponential number of spanning trees. Thus enumerating all of them in order to find a minimum one is not feasible. But there are very fast algorithms that solve the problem:

  • Kruskal’s algorithm

  • Prim’s algorithm

Both are greedy algorithms (recall Section Greedy Algorithms). They build an MST by starting from an empty set and then adding edges greedily one-by-one until the edge set is a tree spanning the whole graph. During the construction of this edge set, call it \(\MSTCand\), both maintain the following invariant:

Before each iteration of the main loop, \(\MSTCand\) is a subset of some MST.

To maintain the invariant, the algorithms then select and add a safe edge for \(\MSTCand\): an edge \(\Set{u,v} \notin \MSTCand\) such that \(\MSTCand \cup \Set{\Set{u,v}}\) is also a subset of some MST. Such an edge always exists until \(\MSTCand\) is itself an MST: since \(\MSTCand\) is then a proper subset of some MST, that MST contains an edge that is not in \(\MSTCand\), and adding it keeps \(\MSTCand\) a subset of that MST. The greedy heuristics are used to select such a safe edge efficiently. The generic version in pseudo-code:

MST-Generic(\(G = \Tuple{\Verts,\Edges,\Weights}\)):
  \(\MSTCand \Asgn \emptyset\)  // The edges in the MST
  while \(\MSTCand\) does not form a spanning tree:
    let \(\Set{u,v}\) be a safe edge for \(\MSTCand\)
    \(\MSTCand \Asgn \MSTCand \cup \Set{\Set{u,v}}\)
  return \(\MSTCand\)
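
As a side note, the generic algorithm can be sketched in Python as follows (a sketch only, with illustrative names; the safe-edge selection is passed in as a function, which Kruskal’s and Prim’s algorithms below instantiate in their own, efficient ways). Edges are frozensets keyed into a weight dictionary as in the earlier sketch.

    def mst_generic(vertices, weights, find_safe_edge):
        """Generic greedy MST construction for a connected graph.

        find_safe_edge(A) must return an edge {u, v} not in A such that
        A together with {u, v} is still a subset of some MST.
        """
        A = set()                          # the edges of the MST under construction
        while len(A) < len(vertices) - 1:  # a spanning tree has |V| - 1 edges
            A.add(find_safe_edge(A))       # adding a safe edge maintains the invariant
        return A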

We now only need a way to efficiently detect safe edges. For that, we use the following definitions:

  • ​ A cut for a graph \(\Graph = \Tuple{\Verts,\Edges,\Weights}\) is a partition \(\Tuple{S,\Verts \setminus S}\) of \(\Verts\).

  • ​ An edge crosses a cut \(\Tuple{S,\Verts \setminus S}\) if one of its endpoints is in \(S\) and the other is in \(\Verts \setminus S\).

  • ​ An edge is a light edge crossing a cut if its weight is the smallest among those edges that cross the cut.
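
These notions can be sketched in Python as follows (illustrative only; weights is the frozenset-keyed weight dictionary from the earlier sketch and S is a set of vertices).

    def crossing_edges(weights, S):
        """The edges with exactly one endpoint in S (the other endpoint is then outside S)."""
        return [e for e in weights if len(e & S) == 1]

    def light_edges(weights, S):
        """The crossing edges whose weight is smallest among all crossing edges."""
        crossing = crossing_edges(weights, S)
        if not crossing:
            return []
        w_min = min(weights[e] for e in crossing)
        return [e for e in crossing if weights[e] == w_min]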

Example

Consider the graph below. The cut \(\Tuple{\Set{a,e,f},\Set{b,c,d,g}}\) is shown with a dashed red line. The edges \(\Set{a,b}\), \(\Set{b,f}\) and \(\Set{c,e}\) cross the cut. The light edges crossing the cut are \(\Set{b,f}\) and \(\Set{c,e}\).

_images/mst-ex1-cut1.png

The following theorem gives an easy method for the greedy algorithms to select the next safe edge to be included in the MST under construction. Essentially, it says that for any cut that doesn’t split the partial MST under construction, one can choose any crossing edge with the smallest weight to be included in the MST. We need one more definition: a cut \(\Tuple{S,\Verts \setminus S}\) respects a set \(\MSTCand \subseteq \Edges\) of edges if no edge in \(\MSTCand\) crosses the cut.

Theorem

Let \(\Graph = \Tuple{\Verts,\Edges,\Weights}\) be an undirected, connected edge-weighted graph. If \(\MSTCand\) is a subset of an MST for \(\Graph\), \(\Tuple{S,\Verts \setminus S}\) is a cut for \(\Graph\) that respects \(\MSTCand\) and \(\Set{u,v}\) is a light edge crossing \(\Tuple{S,\Verts \setminus S}\), then \(\Set{u,v}\) is a safe edge for \(\MSTCand\).

Example

Again, consider the graph below. The cut \(\Tuple{\Set{a,e,f},\Set{b,c,d,g}}\) is shown with a dashed red line. The edges \(\Set{a,b}\), \(\Set{b,f}\) and \(\Set{c,e}\) cross the cut.

_images/mst-ex1-cut2.png

The light edges crossing the cut are \(\Set{b,f}\) and \(\Set{c,e}\). The edges in \(\MSTCand\), the MST under construction, are highlighted. They are a subset of an MST. The cut respects \(\MSTCand\). By the theorem, either of the light edges can be added to \(\MSTCand\) so that the result is still a subset of an MST.

Proof: (Sketch)

Let \(\MSTCand'\) be the edge set of an MST \(T'\) such that \(\MSTCand \subseteq \MSTCand'\) holds.

If \(\Set{u,v} \in \MSTCand'\), then it is safe for \(\MSTCand\) and we are done.

Suppose then that \(\Set{u,v} \notin \MSTCand'\), \(u \in S\), and \(v \in \Verts \setminus S\). As \(T'\) is a tree, it contains a unique simple path from \(u\) to \(v\). Let \(y\) be the first vertex on this path that belongs to \(\Verts \setminus S\) and let \(x\) be its predecessor on the path, which still belongs to \(S\). The edge \(\Set{x,y}\) thus crosses the cut and therefore does not belong to \(\MSTCand\), because the cut respects \(\MSTCand\). As \(\Set{u,v}\) is a light edge crossing the cut and \(\Set{x,y}\) also crosses it, we have \(\UWeight{u}{v} \le \UWeight{x}{y}\). Consider the tree \(T\) obtained by first removing the edge \(\Set{x,y}\) from \(T'\) (this splits \(T'\) into two disjoint trees, one containing the vertex \(u\) and the other the vertex \(v\)) and then adding the edge \(\Set{u,v}\). The weight of \(T\) is at most the weight of \(T'\); as \(T'\) is a minimum spanning tree, so is \(T\). Because \(\MSTCand \subseteq \MSTCand' \setminus \Set{\Set{x,y}}\), the edge set of \(T\) contains \(\MSTCand \cup \Set{\Set{u,v}}\), and thus the edge \(\Set{u,v}\) is safe for \(\MSTCand\).

Kruskal’s algorithm

Kruskal’s algorithm finds the next safe edge to be added by

  • ​ first sorting the edges in non-decreasing order by their weight, and

  • ​ then taking the next edge in this order that connects two different trees in the forest formed by the current edge set \(\MSTCand\).

To quickly detect whether an edge connects two different trees, the vertices of the trees are maintained in a disjoint-sets data structure (recall Section Disjoint-sets with the union-find data structure). When an edge is added, the two trees in question are merged into a new, larger tree. The cut is not maintained explicitly, but for an edge \(\Set{u,v}\) connecting the trees containing \(u\) and \(v\), one can take any cut that puts the tree of \(u\) on one side and the tree of \(v\) on the other; any remaining trees can be placed on either side.
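
The following is a minimal Python sketch of such a disjoint-sets structure, with union by rank and path compression, written here only to support the Kruskal sketch below; it is a simplified stand-in for the implementation of the earlier section, and the names are illustrative.

    class DisjointSets:
        def __init__(self, elements):
            self.parent = {x: x for x in elements}  # every element starts in its own set
            self.rank = {x: 0 for x in elements}

        def find_set(self, x):
            """The representative of the set containing x (with path compression)."""
            if self.parent[x] != x:
                self.parent[x] = self.find_set(self.parent[x])
            return self.parent[x]

        def union(self, x, y):
            """Merge the sets containing x and y (union by rank)."""
            rx, ry = self.find_set(x), self.find_set(y)
            if rx == ry:
                return
            if self.rank[rx] < self.rank[ry]:
                rx, ry = ry, rx
            self.parent[ry] = rx
            if self.rank[rx] == self.rank[ry]:
                self.rank[rx] += 1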

In pseudo-code:

MST-Kruskal(\(G = \Tuple{\Verts,\Edges,\Weights}\)):
  \(\MSTCand \Asgn \emptyset\)
  for each vertex \(v \in \Verts\):  // All vertices are initially in their own unit sets
    \(\UFMS{v}\)
  sort the edges \(\Edges\) into nondecreasing order by their weight
  for each edge \(\Set{u,v} \in \Edges\) in nondecreasing weight-order:
    if \(\UFFS{u} \neq \UFFS{v}\):  // Are \(u\) and \(v\) in different sets?
      \(\MSTCand \Asgn \MSTCand \cup \Set{\Set{u,v}}\)  // Record the merge edge
      \(\UFUnion{u}{v}\)  // Merge the sets
  return \(\MSTCand\)

With the rank-based disjoint-sets implementation introduced in Section Disjoint-sets with the union-find data structure, one can argue that the worst-case running time of the algorithm is \(\Oh{\NofE \Lg{\NofE}}\): sorting the edges takes asymptotically the largest amount of time in the algorithm. Recalling that \(\NofE \le \NofV^2\) and thus \(\Lg{\NofE} \le 2 \Lg{\NofV}\), the worst-case bound can also be stated as \(\Oh{\NofE \Lg{\NofV}}\).
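
A Python sketch of the algorithm, following the pseudo-code above and using the DisjointSets class and the frozenset-keyed weight dictionary from the earlier sketches (the input graph is assumed to be connected; the usage example at the end is a hypothetical three-vertex graph, not the graph of the figures):

    def mst_kruskal(vertices, weights):
        dsets = DisjointSets(vertices)                  # every vertex in its own set
        A = set()                                       # the edges of the MST
        for e in sorted(weights, key=weights.get):      # nondecreasing weight order
            u, v = tuple(e)
            if dsets.find_set(u) != dsets.find_set(v):  # do u and v lie in different trees?
                A.add(e)                                # record the merge edge
                dsets.union(u, v)                       # merge the two trees
        return A

    # Hypothetical usage:
    w = {edge("a", "b"): 1, edge("b", "c"): 2, edge("a", "c"): 3}
    assert mst_kruskal({"a", "b", "c"}, w) == {edge("a", "b"), edge("b", "c")}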

Example: Applying Kruskal’s algorithm

Consider the graph shown below. The edges in a weight-sorted order are (edges with the same weight are ordered arbitrarily): \(\Set{e,f}, \Set{b,c}, \Set{b,f}, \Set{c,e}, \Set{c,d}, \Set{a,e}, \Set{b,d}, \Set{a,b}, \Set{a,f}, \Set{d,g}\).

_images/kruskal-ex1-0.png

First, we add the edge \(\Set{e,f}\) of weight 2 and obtain the situation below (the edges in the MST under construction \(\MSTCand\) are highlighted). The cut considered could be \(\Tuple{\Set{e},\Set{a,b,c,d,f,g}}\).

_images/kruskal-ex1-1.png

Next, we add the edge \(\Set{b,c}\) of weight 3 and obtain the situation below. The cut considered could be \(\Tuple{\Set{b},\Set{a,c,d,e,f,g}}\).

_images/kruskal-ex1-2.png

Next, we add the edge \(\Set{b,f}\) of weight 4 and obtain the situation below. The cut considered could be \(\Tuple{\Set{e,f},\Set{a,b,c,d,g}}\).

_images/kruskal-ex1-3.png

The next edge \(\Set{c,e}\) is skipped because the vertices \(c\) and \(e\) already belong to the same tree. After this, we add the edge \(\Set{c,d}\) of weight 6 and obtain the situation below. The cut considered could be \(\Tuple{\Set{b,c,e,f},\Set{a,d,g}}\).

_images/kruskal-ex1-4.png

Next, we add the edge \(\Set{a,e}\) of weight 6 and obtain the situation below. The cut considered could be \(\Tuple{\Set{a},\Set{b,c,d,e,f,g}}\).

_images/kruskal-ex1-5.png

The next edges \(\Set{b,d}\), \(\Set{a,b}\), and \(\Set{a,f}\) are skipped because their end-points already belong to the same tree. Finally, we add the edge \(\Set{d,g}\) of weight 9 and obtain the situation below. The cut considered could be \(\Tuple{\Set{a,b,c,d,e,f},\Set{g}}\).

_images/kruskal-ex1-6.png

The result shown above is an MST of weight 30.

Prim’s algorithm

Prim’s algorithm also implements the generic greedy MST finding algorithm but selects the next safe edge in a different way. It maintains the cut explicitly:

  • ​ one side of the cut consists of the vertices of the MST constructed so far, while

  • ​ the other side consists of the vertices not yet joined to the MST.

The process starts from an arbitrary root vertex. Then, at each step, an arbitrary light edge from the first half to the second half is selected and the corresponding vertex is moved from the second half to the first. To select a light edge quickly, each vertex in the second half maintains two attributes: the weight of a smallest-weight edge connecting it to the first half (referred to as “key” in the pseudo-code) and the other endpoint of such an edge (“parent” in the pseudo-code). All the vertices in the second half are stored in a minimum priority queue \(Q\) (recall Section Priority queues with binary heaps) with the key acting as the priority. In this way, it is efficient to find the vertex with the smallest key: the corresponding edge from the first half to it must be a light edge. In the beginning, all the keys are \(\infty\) except for the root vertex, whose key is 0. Whenever a vertex is moved from the second half to the first (dequeued from the priority queue), the attributes of its neighbors are updated.

In pseudo-code:

MST-Prim(\(G=\Tuple{\Verts,\Edges,\UWeights}\), \(\MSTRoot\)):
  for each vertex \(\Vtx \in \Verts\):
    \(\MSTKey{\Vtx} \Asgn \infty\)
    \(\MSTParent{\Vtx} \Asgn \MSTNil\)
  \(\MSTKey{\MSTRoot} \Asgn 0\)
  \(Q \Asgn \Verts\)  // Min-priority queue (w.r.t. the key) of the vertices not yet in the MST
  \(\MSTCand \Asgn \emptyset\)  // The edges in the MST
  while \(Q \neq \emptyset\):
    let \(u\) be a vertex in \(Q\) with the smallest key
    remove \(u\) from \(Q\)
    if \(u \neq \MSTRoot\):
      \(\MSTCand \Asgn \MSTCand \cup \Set{\Set{\MSTParent{u},u}}\)
    for each vertex \(v\) with an edge \(\Set{u,v} \in \Edges\):
      if \(v \in Q\) and \(\UWeight{u}{v} < \MSTKey{v}\):
        \(\MSTParent{v} \Asgn u\)
        \(\MSTKey{v} \Asgn \UWeight{u}{v}\)
  return \(\MSTCand\)
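
The following Python sketch implements the same idea (illustrative only). Instead of a decrease-priority operation, it uses the heapq module with “lazy deletion”: when a key is decreased, a new entry is pushed and stale entries are simply skipped when popped, which keeps the \(\Oh{\NofE\Lg{\NofV}}\) bound derived below. The frozenset-keyed weight dictionary is the convention of the earlier sketches.

    import heapq
    from itertools import count

    def mst_prim(vertices, weights, root):
        # Build adjacency lists from the frozenset-keyed weight dictionary.
        adj = {v: {} for v in vertices}
        for e, w in weights.items():
            u, v = tuple(e)
            adj[u][v] = w
            adj[v][u] = w

        in_tree = set()                        # vertices already joined to the MST
        parent = {v: None for v in vertices}
        key = {v: float("inf") for v in vertices}
        key[root] = 0
        tie = count()                          # tie-breaker so vertices are never compared
        queue = [(0, next(tie), root)]         # min-heap of (key, tie, vertex); may hold stale entries
        A = set()                              # the edges of the MST

        while queue:
            _, _, u = heapq.heappop(queue)
            if u in in_tree:                   # a stale entry from an earlier key update: skip it
                continue
            in_tree.add(u)
            if parent[u] is not None:
                A.add(frozenset((parent[u], u)))
            for v, w in adj[u].items():
                if v not in in_tree and w < key[v]:
                    parent[v] = u
                    key[v] = w
                    heapq.heappush(queue, (w, next(tie), v))  # instead of decrease-priority
        return A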

Example: Applying Prim’s algorithm

Consider again the graph shown below.

_images/prim-ex1-0.png

The graphs below show the spanning tree constructed so far (the highlighted edges) at the end of each iteration of the while-loop; the green vertices are the ones that are no longer in \(Q\). Supposing that the root vertex was \(f\), the situation after the first iteration is shown below. Observe that the keys of the neighbors of \(f\) have been updated. The “parent” attributes are not shown in the graph.

_images/prim-ex1-1.png

Next, the vertex \(e\), which has the smallest key, is dequeued and the edge \(\Set{e,f}\) is added to the MST. The keys of the neighbors of \(e\) are updated.

_images/prim-ex1-2.png

Now the edge \(\Set{b,f}\) is added to the MST. Observe that the edge \(\Set{c,e}\) could have been selected instead and this would result in a different final MST.

_images/prim-ex1-3.png

Next, add the edge \(\Set{b,c}\).

_images/prim-ex1-4.png

Next, add the edge \(\Set{b,d}\).

_images/prim-ex1-5.png

Next, add the edge \(\Set{a,e}\).

_images/prim-ex1-6.png

Next, add the edge \(\Set{d,g}\).

_images/prim-ex1-7.png

The final result is an MST of weight 30.

Let’s perform a worst-case running time analysis of the algorithm. Assume that the priority queue is implemented with a binary heap (recall Section Priority queues with binary heaps).

  • ​ The initialization phase, including building the priority queue, can be done in time \(\Oh{\NofV \Lg{\NofV}}\). In fact, as all the keys except the root’s are initially \(\infty\), the priority queue can be initialized in linear time, and thus the initialization phase can be done in time \(\Oh{\NofV}\).

  • ​ The while-loop executes \(\NofV\) times as one vertex is removed from \(Q\) in each iteration.

    • ​ Removing a smallest element from the priority queue takes time \(\Oh{\Lg{\NofV}}\).

    • ​ The set \(\MSTCand\) can be implemented as a linked list or an array of \(\NofV-1\) elements and thus adding an edge to it takes constant time.

  • ​ Each edge is considered twice, once for each endpoint.

    • ​ The keys of the vertices are thus updated at most \(2 \NofE\) times, each update taking \(\Oh{\Lg{\NofV}}\) time.

The total running time is thus \(\Oh{\NofV + \NofV \Lg{\NofV} + \NofE\Lg{\NofV}}\). This equals

\[\Oh{\NofE\Lg{\NofV}}\]

as \(\NofE \ge \NofV - 1\) for connected graphs.

A small side track: Steiner trees

As shown above, we have very efficient algorithms for finding minimum spanning trees. But if we make a rather small modification in the problem definition, we obtain a problem for which we do not know any polynomial time algorithms at the moment. In fact, many believe (but we cannot yet prove this) that such algorithms do not exist. Consider the problem of finding minimum Steiner trees instead of minimum spanning trees.

Definition: Minimum Steiner tree problem

Given a connected, edge-weighted undirected graph \(\Graph=\Tuple{\Verts,\Edges,\UWeights}\) and a subset \(S \subseteq \Verts\) of terminal vertices, find an acyclic sub-graph \(T=\Tuple{\Verts,\MSTCand,\UWeights}\) with \(\MSTCand \subseteq \Edges\) that (i) connects all the terminal vertices to each other with some paths and (ii) minimizes the edge-weight sum \(\sum_{\Set{u,v}\in\MSTCand}\UWeight{u}{v}\).

That is, the tree does not have to span all the vertices but only the terminals and some of the other vertices needed to connect the terminals to each other.
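
To make the definition concrete, here is a naive Python sketch (illustrative only, and not one of the algorithms discussed below): it tries every subset of non-terminal vertices, uses the mst_kruskal and total_weight sketches from above on the subgraph induced by the terminals together with that subset, and keeps the lightest tree that connects all the terminals. Its running time is exponential in the number of non-terminal vertices, so it only illustrates the problem definition.

    from itertools import combinations

    def minimum_steiner_tree_bruteforce(vertices, weights, terminals):
        others = [v for v in vertices if v not in set(terminals)]
        best_tree, best_weight = None, float("inf")
        for r in range(len(others) + 1):
            for extra in combinations(others, r):
                used = set(terminals) | set(extra)
                sub = {e: w for e, w in weights.items() if e <= used}  # induced subgraph
                tree = mst_kruskal(used, sub)
                if len(tree) == len(used) - 1:     # the induced subgraph is connected
                    w = total_weight(tree, sub)
                    if w < best_weight:
                        best_tree, best_weight = tree, w
        return best_tree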

Example

Consider the graph below with the terminal vertices \(\Set{a,b,h,k}\) shown in green.

_images/steiner-ex1-graph.png

The following is a Steiner tree of weight 15.

_images/steiner-ex1-steiner.png

To illustrate the difference between the two problems, the graph below shows a minimum spanning tree for the same graph.

_images/steiner-ex1-mst.png

As an application example, we could extend our earlier circuit board example. Now

  • ​ the terminal vertices are the components,

  • ​ the other vertices are crossing positions on the board through which we can draw the power/clock wires, and

  • ​ the edges are the possible connections between the components and crossings.

We now want to find a minimum Steiner tree instead of a minimum spanning tree, as we do not need to connect the power/clock wire to every crossing; only the components need to be connected to each other.

Finding minimum Steiner trees is one example of an NP-hard problem. For such problems, we do not know algorithms that would have polynomial worst-case running time. There are algorithms that solve the minimum Steiner tree problem in time that is (i) exponential in the size of the terminal set \(S\) but (ii) polynomial in the size of the graph. Thus the problem is “fixed-parameter tractable”; see, for instance, this slide set. These algorithms and concepts are not included in the course exam.