## 4.1 Minimum Spanning Trees

Suppose an urban planner wants to connect certain cities with roads so that it is possible for someone to drive from any city to any other city. If there are budgetary restrictions, the planner may want to do this with the minimum amount of road. We will develop an algorithm that solves this and similar problems.

First, let's informally review more graph theory. [Figure 4.3(a)](#ch04fig03) shows a connected, weighted, undirected graph *G*. We assume here that the weights are nonnegative numbers. The graph is ***undirected*** because the edges do not have direction. This is represented pictorially by an absence of arrows on the edges. Because the edges do not have direction, we say an edge is *between* two vertices. A ***path*** in an undirected graph is a sequence of vertices such that there is an edge between each vertex and its successor. Because the edges have no direction, there is a path from vertex *u* to vertex *v* if and only if there is a path from *v* to *u*. Therefore, for undirected graphs we simply say that there is a *path* between two vertices. An undirected graph is called ***connected*** if there is a path between every pair of vertices. All the graphs in [Figure 4.3](#ch04fig03) are connected. If we removed the edge between *v*2 and *v*4 from the graph in [Figure 4.3(b)](#ch04fig03), the graph would no longer be connected.

[![Click To expand](https://box.kancloud.cn/fde5874d75b10038b05cdc46a30637e6_350x407.jpg)](fig4-3_0.jpg)

Figure 4.3: A weighted graph and three subgraphs.

In an undirected graph, a path from a vertex to itself, which contains at least three vertices and in which all intermediate vertices are distinct, is called a ***simple cycle***. An undirected graph with no simple cycles is called ***acyclic***. The graphs in [Figure 4.3(c)](#ch04fig03) and [Figure 4.3(d)](#ch04fig03) are acyclic, whereas the ones in [Figure 4.3(a)](#ch04fig03) and [Figure 4.3(b)](#ch04fig03) are not.
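These definitions are easy to check mechanically. The sketch below (Python, using a hypothetical edge-list representation rather than anything from the text) tests whether an undirected graph is connected, via depth-first search, and whether it is acyclic, via disjoint sets:

```python
def is_connected(vertices, edges):
    # Depth-first search from an arbitrary start vertex; the graph
    # is connected iff every vertex is reached.
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    start = next(iter(vertices))
    seen, stack = {start}, [start]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(vertices)

def is_acyclic(vertices, edges):
    # Union-find: an edge whose endpoints already share a root
    # would close a simple cycle.
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False
        parent[ru] = rv
    return True

V = {1, 2, 3, 4}
tree_edges = [(1, 2), (2, 3), (3, 4)]           # connected and acyclic
cycle_edges = [(1, 2), (2, 3), (3, 1), (3, 4)]  # connected, contains a simple cycle
```

A graph that passes both tests is exactly a free tree in the sense defined next.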
A ***tree*** (technically, a *free tree*) is an acyclic, connected, undirected graph. The graphs in [Figure 4.3(c)](#ch04fig03) and [Figure 4.3(d)](#ch04fig03) are trees. With this definition, no vertex is singled out as the root, and a ***rooted tree*** is defined as a tree with one vertex designated as the root. Therefore, a rooted tree is what is often called a *tree* (as was done in [Section 3.5](LiB0031.html#354)).

Consider the problem of removing edges from a connected, weighted, undirected graph *G* to form a subgraph such that all the vertices remain connected and the sum of the weights on the remaining edges is as small as possible. Such a problem has numerous applications. As mentioned earlier, in road construction we may want to connect a set of cities with a minimum amount of road. Similarly, in telecommunications we may want to use a minimal length of cable, and in plumbing we may want to use a minimal amount of pipe. A subgraph with minimum weight must be a tree, because if a subgraph were not a tree, it would contain a simple cycle, and we could remove any edge on the cycle, resulting in a connected graph with a smaller weight. To illustrate this, look at [Figure 4.3](#ch04fig03). The subgraph in [Figure 4.3(b)](#ch04fig03) of the graph in [Figure 4.3(a)](#ch04fig03) cannot have minimum weight because if we remove any edge on the simple cycle [*v*3, *v*4, *v*5, *v*3], the subgraph remains connected. For example, we could remove the edge connecting *v*4 and *v*5, resulting in a connected graph with a smaller weight.

A ***spanning tree*** for *G* is a connected subgraph that contains all the vertices in *G* and is a tree. The trees in [Figure 4.3(c)](#ch04fig03) and [Figure 4.3(d)](#ch04fig03) are spanning trees for *G*. A connected subgraph of minimum weight must be a spanning tree, but not every spanning tree has minimum weight.
For example, the spanning tree in [Figure 4.3(c)](#ch04fig03) does not have minimum weight, because the spanning tree in [Figure 4.3(d)](#ch04fig03) has a lesser weight. An algorithm for our problem must obtain a spanning tree of minimum weight. Such a tree is called a ***minimum spanning tree***. The tree in [Figure 4.3(d)](#ch04fig03) is a minimum spanning tree for *G*. A graph can have more than one minimum spanning tree. There is another one for *G*, which you may wish to find. To find a minimum spanning tree by the brute-force method of considering all spanning trees is worse than exponential in the worst case. We will solve the problem more efficiently using the greedy approach. First we need the formal definition of an undirected graph.

Definition

An ***undirected graph*** *G* consists of a finite set *V* whose members are called the vertices of *G*, together with a set *E* of pairs of vertices in *V*. These pairs are called the edges of *G*. We denote *G* by

> *G* = (*V*, *E*)

We will denote members of *V* by *v**i*, and the edge between *v**i* and *v**j* by

> (*v**i*, *v**j*)

Example 4.1

For the graph in [Figure 4.3(a)](#ch04fig03),

![](https://box.kancloud.cn/aa01014b6d55807aaf7181dd32bc2497_369x76.jpg)

The order in which we list the vertices to denote an edge is irrelevant in an undirected graph. For example, (*v*1, *v*2) denotes the same edge as (*v*2, *v*1). We have listed the vertex with the lower index first.

A spanning tree *T* for *G* has the same vertices *V* as *G*, but the set of edges of *T* is a subset *F* of *E*. We will denote a spanning tree by *T* = (*V*, *F*). Our problem is to find a subset *F* of *E* such that *T* = (*V*, *F*) is a minimum spanning tree for *G*.
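As a point of comparison for the greedy algorithms that follow, the brute-force method mentioned above can be sketched directly: enumerate every subset of *n* − 1 edges, keep those that connect all the vertices (these are exactly the spanning trees), and return the lightest. The small weighted graph in the example is hypothetical, not the one in Figure 4.3(a).

```python
from itertools import combinations

def spans(vertices, subset):
    # True if the edge subset connects every vertex (DFS).
    adj = {v: [] for v in vertices}
    for u, v, _ in subset:
        adj[u].append(v)
        adj[v].append(u)
    start = next(iter(vertices))
    seen, stack = {start}, [start]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(vertices)

def brute_force_mst(vertices, weighted_edges):
    # Any connected subset of n - 1 edges is acyclic, hence a spanning
    # tree, so it suffices to scan all (n - 1)-edge subsets.
    n = len(vertices)
    best = None
    for subset in combinations(weighted_edges, n - 1):
        if spans(vertices, subset):
            w = sum(wt for _, _, wt in subset)
            if best is None or w < best[1]:
                best = (subset, w)
    return best

V = {1, 2, 3, 4}
E = [(1, 2, 1), (1, 3, 4), (2, 3, 2), (3, 4, 3), (2, 4, 6)]  # (u, v, weight)
tree, weight = brute_force_mst(V, E)
```

The number of candidate subsets is C(*m*, *n* − 1), which grows explosively with the size of the graph; this is why the greedy algorithms below matter.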
A high-level greedy algorithm for the problem could proceed as follows:

```
F = Ø                                     // Initialize set of edges to empty.
while (the instance is not solved){
    select an edge according to some
      locally optimal consideration;      // selection procedure
    if (adding the edge to F does
        not create a cycle)
        add it;                           // feasibility check
    if (T = (V, F) is a spanning tree)    // solution check
        the instance is solved;
}
```

This algorithm simply says "select an edge according to some locally optimal consideration." There is no unique locally optimal property for a given problem. We will investigate two different greedy algorithms for this problem, Prim's algorithm and Kruskal's algorithm. Each uses a different locally optimal property. Recall that there is no guarantee that a given greedy algorithm always yields an optimal solution. One must prove whether or not this is the case. We will prove that both Prim's and Kruskal's algorithms always produce minimum spanning trees.

### 4.1.1 Prim's Algorithm

Prim's algorithm starts with an empty subset of edges *F* and a subset of vertices *Y* initialized to contain an arbitrary vertex. We will initialize *Y* to {*v*1}. A vertex *nearest* to *Y* is a vertex in *V* − *Y* that is connected to a vertex in *Y* by an edge of minimum weight. (Recall from [Chapter 3](LiB0024.html#252) that weight and distance terminology are used interchangeably for weighted graphs.) In [Figure 4.3(a)](#ch04fig03), *v*2 is nearest to *Y* when *Y* = {*v*1}. The vertex that is nearest to *Y* is added to *Y*, and the edge is added to *F*. Ties are broken arbitrarily. In this case, *v*2 is added to *Y*, and (*v*1, *v*2) is added to *F*. This process of adding nearest vertices is repeated until *Y* = *V*. The following is a high-level algorithm for this procedure:

```
F = Ø         // Initialize set of edges to empty.
Y = {v1}      // Initialize set of vertices to contain only the first one.
while (the instance is not solved){
    select a vertex in V - Y that is      // selection procedure and
      nearest to Y;                       // feasibility check
    add the vertex to Y;
    add the edge to F;
    if (Y == V)                           // solution check
        the instance is solved;
}
```

The selection procedure and feasibility check are done together because taking the new vertex from *V* − *Y* guarantees that a cycle is not created.

[Figure 4.4](#ch04fig04) illustrates Prim's algorithm. At each step in that figure, *Y* contains the shaded vertices and *F* contains the shaded edges.

[![Click To expand](https://box.kancloud.cn/649463c2bf4f414e8740975c90df00ab_274x500.jpg)](fig4-4_0.jpg)

Figure 4.4: A weighted graph (in the upper-left corner) and the steps in Prim's algorithm for that graph. The vertices in *Y* and the edges in *F* are shaded at each step.

The high-level algorithm works fine for a human creating a minimum spanning tree for a small graph from a picture of the graph. The human merely finds the vertex nearest to *Y* by inspection. However, for the purposes of writing an algorithm that can be implemented in a computer language, we need to describe a step-by-step procedure. To this end, we represent a weighted graph by its adjacency matrix. That is, we represent it by an *n* × *n* array *W* of numbers where

- *W*[*i*][*j*] = the weight on the edge, if there is an edge between *v**i* and *v**j*
- *W*[*i*][*j*] = ∞, if there is no edge between *v**i* and *v**j*
- *W*[*i*][*j*] = 0, if *i* = *j*

The graph in [Figure 4.3(a)](#ch04fig03) is represented in this manner in [Figure 4.5](#ch04fig05). We maintain two arrays, *nearest* and *distance*, where, for *i* = 2, …, *n*,

- *nearest*[*i*] = index of the vertex in *Y* nearest to *v**i*
- *distance*[*i*] = weight on the edge between *v**i* and the vertex indexed by *nearest*[*i*]

[![Click To expand](https://box.kancloud.cn/f3f9c0762d42d91affb95c86b9eeee2b_279x283.jpg)](fig4-5_0.jpg)

Figure 4.5: The array representation *W* of the graph in [Figure 4.3](#ch04fig03)(a).

Because at the start *Y* = {*v*1}, *nearest*[*i*] is initialized to 1 and *distance*[*i*] is initialized to the weight on the edge between *v*1 and *v**i*.
As vertices are added to *Y*, these two arrays are updated to reference the new vertex in *Y* nearest to each vertex outside of *Y*. To determine which vertex to add to *Y*, in each iteration we compute the index for which *distance*[*i*] is the smallest. We call this index *vnear*. The vertex indexed by *vnear* is added to *Y* by setting *distance*[*vnear*] to −1. The following algorithm implements this procedure.

Algorithm 4.1: Prim's Algorithm

Problem: Determine a minimum spanning tree.

Inputs: integer *n* ≥ 2, and a connected, weighted, undirected graph containing *n* vertices. The graph is represented by a two-dimensional array *W*, which has both its rows and columns indexed from 1 to *n*, where *W*[*i*][*j*] is the weight on the edge between the *i*th vertex and the *j*th vertex.

Outputs: set of edges *F* in a minimum spanning tree for the graph.

```
void prim (int n, const number W[][], set_of_edges& F)
{
    index i, vnear;
    number min;
    edge e;
    index nearest[2..n];
    number distance[2..n];

    F = Ø;
    for (i = 2; i <= n; i++){
        nearest[i] = 1;          // For all vertices, initialize v1 to be the
        distance[i] = W[1][i];   // nearest vertex in Y and initialize the
    }                            // distance from Y to be the weight on the
                                 // edge to v1.
    repeat (n - 1 times){        // Add all n - 1 vertices to Y.
        min = ∞;
        for (i = 2; i <= n; i++)             // Check each vertex for
            if (0 ≤ distance[i] < min){      // being nearest to Y.
                min = distance[i];
                vnear = i;
            }
        e = edge connecting vertices indexed by vnear and nearest[vnear];
        add e to F;
        distance[vnear] = -1;                // Add vertex indexed by
        for (i = 2; i <= n; i++)             // vnear to Y.
            if (W[i][vnear] < distance[i]){  // For each vertex not in Y,
                distance[i] = W[i][vnear];   // update its distance
                nearest[i] = vnear;          // from Y.
            }
    }
}
```

Analysis of Algorithm 4.1

Every-Case Time Complexity (Prim's Algorithm)

Basic operation: There are two loops, each with *n* − 1 iterations, inside the **repeat** loop. Executing the instructions inside each of them can be considered to be doing the basic operation once.

Input size: *n*, the number of vertices.

Because the **repeat** loop has *n* − 1 iterations, the time complexity is given by

> *T*(*n*) = 2(*n* − 1)(*n* − 1) ∊ Θ(*n*²)

Clearly, Prim's algorithm produces a spanning tree. However, is it necessarily minimal? Because at each step we select the vertex nearest to *Y*, intuitively it seems that the tree should be minimal. However, we need to prove whether or not this is the case. Although greedy algorithms are often easier to develop than dynamic programming algorithms, usually it is more difficult to determine whether or not a greedy algorithm always produces an optimal solution. Recall that for a dynamic programming algorithm we need only show that the principle of optimality applies. For a greedy algorithm we usually need a formal proof. Next we give such a proof for Prim's algorithm.

Let an undirected graph *G* = (*V*, *E*) be given. A subset *F* of *E* is called ***promising*** if edges can be added to it so as to form a minimum spanning tree. The subset {(*v*1, *v*2), (*v*1, *v*3)} in [Figure 4.3(a)](#ch04fig03) is promising, and the subset {(*v*2, *v*4)} is not promising.
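A runnable sketch of Algorithm 4.1 in Python follows, with vertices indexed 0 through n - 1 instead of 1 through n and `INF` standing in for ∞; the 4-vertex adjacency matrix is a made-up example, not the graph of Figure 4.3(a).

```python
INF = float("inf")

def prim(n, W):
    # W: n x n adjacency matrix; W[i][j] is the weight on the edge
    # between vertex i and vertex j, INF if there is no edge, 0 if i == j.
    # Returns the list F of edges in a minimum spanning tree.
    F = []
    nearest = [0] * n                        # nearest[i]: vertex in Y closest to i
    distance = [W[0][i] for i in range(n)]   # weight of that edge
    distance[0] = -1                         # -1 marks membership in Y
    for _ in range(n - 1):                   # add the remaining n - 1 vertices to Y
        min_dist, vnear = INF, -1
        for i in range(n):                   # find the vertex nearest to Y
            if 0 <= distance[i] < min_dist:
                min_dist, vnear = distance[i], i
        F.append((nearest[vnear], vnear))
        distance[vnear] = -1                 # add it to Y
        for i in range(n):                   # update distances to Y
            if W[i][vnear] < distance[i]:
                distance[i] = W[i][vnear]
                nearest[i] = vnear
    return F

W = [[0,   1,   4,   INF],
     [1,   0,   2,   6  ],
     [4,   2,   0,   3  ],
     [INF, 6,   3,   0  ]]
F = prim(4, W)
```

The two inner `for` loops each make n passes, executed n - 1 times, mirroring the Θ(n²) analysis above.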
Lemma 4.1

Let *G* = (*V*, *E*) be a connected, weighted, undirected graph; let *F* be a promising subset of *E*; and let *Y* be the set of vertices connected by the edges in *F*. If *e* is an edge of minimum weight that connects a vertex in *Y* to a vertex in *V* − *Y*, then *F* ∪ {*e*} is promising.

Proof: Because *F* is promising, there must be some set of edges *F*′ such that

> *F* ⊆ *F*′

and (*V*, *F*′) is a minimum spanning tree. If *e* ∊ *F*′, then

> *F* ∪ {*e*} ⊆ *F*′

which means *F* ∪ {*e*} is promising and we're done. Otherwise, because (*V*, *F*′) is a spanning tree, *F*′ ∪ {*e*} must contain exactly one simple cycle and *e* must be in the cycle. [Figure 4.6](#ch04fig06) illustrates this. The simple cycle is [*v*1, *v*2, *v*4, *v*3]. As can be seen in [Figure 4.6](#ch04fig06), there must be another edge *e*′ ∊ *F*′ in the simple cycle that also connects a vertex in *Y* to one in *V* − *Y*. If we remove *e*′ from *F*′ ∪ {*e*}, the simple cycle disappears, which means that we have a spanning tree. Because *e* is an edge of minimum weight that connects a vertex in *Y* to one in *V* − *Y*, the weight of *e* must be less than or equal to the weight of *e*′ (in fact, they must be equal). So

> *F*′ ∪ {*e*} − {*e*′}

is a minimum spanning tree. Now

> *F* ∪ {*e*} ⊆ *F*′ ∪ {*e*} − {*e*′}

because *e*′ cannot be in *F* (recall that edges in *F* connect only vertices in *Y*). Therefore, *F* ∪ {*e*} is promising, which completes the proof.
[![Click To expand](https://box.kancloud.cn/a9f1578bae51ea5305675bcbdedd5d14_346x500.jpg)](fig4-6_0.jpg)

Figure 4.6: A graph illustrating the proof in [Lemma 4.1](#ch04ex03). The edges in *F*′ are shaded in color.

Theorem 4.1

Prim's algorithm always produces a minimum spanning tree.

Proof: We use induction to show that the set *F* is promising after each iteration of the **repeat** loop.

Induction base: Clearly the empty set Ø is promising.

Induction hypothesis: Assume that, after a given iteration of the **repeat** loop, the set of edges selected so far, namely *F*, is promising.

Induction step: We need to show that the set *F* ∪ {*e*} is promising, where *e* is the edge selected in the next iteration. Because the edge *e* selected in the next iteration is an edge of minimum weight that connects a vertex in *Y* to one in *V* − *Y*, *F* ∪ {*e*} is promising by [Lemma 4.1](#ch04ex03). This completes the induction proof.

By the induction proof, the final set of edges is promising. Because this set consists of the edges in a spanning tree, that tree must be a minimum spanning tree.

### 4.1.2 Kruskal's Algorithm

Kruskal's algorithm for the Minimum Spanning Tree problem starts by creating disjoint subsets of *V*, one for each vertex and containing only that vertex. It then inspects the edges according to nondecreasing weight (ties are broken arbitrarily). If an edge connects two vertices in disjoint subsets, the edge is added and the subsets are merged into one set. This process is repeated until all the subsets are merged into one set. The following is a high-level algorithm for this procedure.

```
F = Ø                                     // Initialize set of edges to empty.
create disjoint subsets of V, one for each
  vertex and containing only that vertex;
sort the edges in E in nondecreasing order;
while (the instance is not solved){
    select next edge;                     // selection procedure
    if (the edge connects two vertices    // feasibility check
        in disjoint subsets){
        merge the subsets;
        add the edge to F;
    }
    if (all the subsets are merged)       // solution check
        the instance is solved;
}
```

[Figure 4.7](#ch04fig07) illustrates Kruskal's algorithm.

[![Click To expand](https://box.kancloud.cn/70575bf4d0373f17e236b69247f1ac31_350x422.jpg)](fig4-7_0.jpg)

Figure 4.7: A weighted graph (in the upper-left corner) and the steps in Kruskal's algorithm for that graph.

To write a formal version of Kruskal's algorithm, we need a disjoint set abstract data type. Such a data type is implemented in [Appendix C](LiB0108.html#1464). Because that implementation is for disjoint subsets of indices, we need only refer to the vertices by index to use the implementation. The disjoint set abstract data type consists of data types **index** and **set\_pointer**, and routines *initial*, *find*, *merge*, and *equal*, such that if we declare

```
index i;
set_pointer p, q;
```

then

- *initial*(*n*) initializes *n* disjoint subsets, each of which contains exactly one of the indices between 1 and *n*.
- *p* = *find*(*i*) makes *p* point to the set containing index *i*.
- *merge*(*p*, *q*) merges the two sets, to which *p* and *q* point, into one set.
- *equal*(*p*, *q*) returns true if *p* and *q* both point to the same set.

The algorithm follows.

Algorithm 4.2: Kruskal's Algorithm

Problem: Determine a minimum spanning tree.

Inputs: integer *n* ≥ 2, positive integer *m*, and a connected, weighted, undirected graph containing *n* vertices and *m* edges. The graph is represented by a set *E* that contains the edges in the graph along with their weights.
Outputs: *F*, a set of edges in a minimum spanning tree.

```
void kruskal (int n, int m, set_of_edges E, set_of_edges& F)
{
    index i, j;
    set_pointer p, q;
    edge e;
```

Sort the *m* edges in *E* by weight in nondecreasing order;

```
    F = Ø;
    initial(n);          // Initialize n disjoint subsets.
    while (number of edges in F is less than n - 1){
        e = edge with least weight not yet considered;
        i, j = indices of vertices connected by e;
        p = find(i);
        q = find(j);
        if (!equal(p, q)){
            merge(p, q);
            add e to F;
        }
    }
}
```

The **while** loop is exited when there are *n* − 1 edges in *F*, because there are *n* − 1 edges in a spanning tree.

Analysis of Algorithm 4.2

Worst-Case Time Complexity (Kruskal's Algorithm)

Basic operation: a comparison instruction.

Input size: *n*, the number of vertices, and *m*, the number of edges.

There are three considerations in this algorithm:

1. The time to sort the edges. We obtained a sorting algorithm in [Chapter 2](LiB0014.html#141) (Mergesort) that is worst-case Θ(*m* lg *m*). In [Chapter 7](LiB0052.html#627) we will show, for algorithms that sort by comparison of keys, that it is not possible to improve on this performance. Therefore, the time complexity for sorting the edges is given by *W*(*m*) ∊ Θ(*m* lg *m*).

2. The time in the **while** loop. The time it takes to manipulate the disjoint sets is dominant in this loop (because everything else is constant). In the worst case, every edge is considered before the **while** loop is exited, which means there are *m* passes through the loop.
   Using the implementation called Disjoint Set Data Structure II in [Appendix C](LiB0108.html#1464), the time complexity for *m* passes through a loop that contains a constant number of calls to routines *find*, *equal*, and *merge* is given by *W*(*m*) ∊ Θ(*m* lg *m*), where the basic operation is a comparison instruction.

3. The time to initialize *n* disjoint sets. Using the disjoint set data structure implementation mentioned previously, the time complexity for the initialization is given by *T*(*n*) ∊ Θ(*n*).

Because *m* ≥ *n* − 1, the sorting and the manipulations of the disjoint sets dominate the initialization time, which means that

> *W*(*m*, *n*) ∊ Θ(*m* lg *m*)

It may appear that the worst case has no dependence on *n*. However, in the worst case every vertex can be connected to every other vertex, which would mean that

> *m* = *n*(*n* − 1)/2 ∊ Θ(*n*²)

Therefore, we can also write the worst case as follows:

> *W*(*m*, *n*) ∊ Θ(*n*² lg *n*²) = Θ(*n*² lg *n*)

It is useful to use both expressions for the worst case when comparing Kruskal's algorithm with Prim's algorithm.

We need the following lemma to prove that Kruskal's algorithm always produces an optimal solution.

Lemma 4.2

Let *G* = (*V*, *E*) be a connected, weighted, undirected graph; let *F* be a promising subset of *E*; and let *e* be an edge of minimum weight in *E* − *F* such that *F* ∪ {*e*} has no simple cycles. Then *F* ∪ {*e*} is promising.

Proof: The proof is similar to the proof of [Lemma 4.1](#ch04ex03).
Because *F* is promising, there must be some set of edges *F*′ such that

> *F* ⊆ *F*′

and (*V*, *F*′) is a minimum spanning tree. If *e* ∊ *F*′, then

> *F* ∪ {*e*} ⊆ *F*′

which means that *F* ∪ {*e*} is promising and we're done. Otherwise, because (*V*, *F*′) is a spanning tree, *F*′ ∪ {*e*} must contain exactly one simple cycle and *e* must be in the cycle. Because *F* ∪ {*e*} contains no simple cycles, there must be some edge *e*′ ∊ *F*′ that is in the cycle and that is not in *F*. That is, *e*′ ∊ *E* − *F*. The set *F* ∪ {*e*′} has no simple cycles because it is a subset of *F*′. Therefore, the weight of *e* is no greater than the weight of *e*′. (Recall that we assumed *e* is an edge of minimum weight in *E* − *F* such that *F* ∪ {*e*} has no cycles.) If we remove *e*′ from *F*′ ∪ {*e*}, the simple cycle in this set disappears, which means we have a spanning tree. Indeed,

> *F*′ ∪ {*e*} − {*e*′}

is a minimum spanning tree because, as we have shown, the weight of *e* is no greater than the weight of *e*′. Because *e*′ is not in *F*,

> *F* ∪ {*e*} ⊆ *F*′ ∪ {*e*} − {*e*′}

Therefore, *F* ∪ {*e*} is promising, which completes the proof.

Theorem 4.2

Kruskal's algorithm always produces a minimum spanning tree.

Proof: The proof is by induction, starting with the empty set of edges. You are asked to apply [Lemma 4.2](#ch04ex06) to complete the proof in the exercises.
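Algorithm 4.2 can likewise be sketched in runnable Python. The few lines of union-find below stand in for the disjoint set data type of Appendix C (*initial* becomes the `parent` array, *find* uses path halving, *merge* links roots, and *equal* is just a root comparison); the edge list is a made-up example.

```python
def kruskal(n, edges):
    # edges: list of (weight, u, v) with vertices numbered 0..n-1.
    # Returns the list F of edges in a minimum spanning tree.
    parent = list(range(n))     # initial(n): n singleton subsets

    def find(i):                # root of the subset containing i
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    F = []
    for w, u, v in sorted(edges):   # nondecreasing order of weight
        p, q = find(u), find(v)
        if p != q:                  # feasibility check: no cycle created
            parent[p] = q           # merge(p, q)
            F.append((u, v))
            if len(F) == n - 1:     # solution check: spanning tree complete
                break
    return F

edges = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (3, 2, 3), (6, 1, 3)]  # (weight, u, v)
F = kruskal(4, edges)
```

On this example the sorted scan accepts the edges of weight 1, 2, and 3 and rejects the rest, exactly as the high-level algorithm describes.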
### 4.1.3 Comparing Prim's Algorithm with Kruskal's Algorithm

We obtained the following time complexities:

- Prim's Algorithm: *T*(*n*) ∊ Θ(*n*²)
- Kruskal's Algorithm: *W*(*m*, *n*) ∊ Θ(*m* lg *m*) and *W*(*m*, *n*) ∊ Θ(*n*² lg *n*)

We also showed that in a connected graph

> *n* − 1 ≤ *m* ≤ *n*(*n* − 1)/2

For a graph whose number of edges *m* is near the low end of these limits (the graph is very sparse), Kruskal's algorithm is Θ(*n* lg *n*), which means that Kruskal's algorithm should be faster. However, for a graph whose number of edges is near the high end (the graph is highly connected), Kruskal's algorithm is Θ(*n*² lg *n*), which means that Prim's algorithm should be faster.

### 4.1.4 Final Discussion

As mentioned before, the time complexity of an algorithm sometimes depends on the data structure used to implement it. Using heaps, Johnson (1977) created a Θ(*m* lg *n*) implementation of Prim's algorithm. For a sparse graph, this is Θ(*n* lg *n*), which is an improvement over our implementation. But for a dense graph, it is Θ(*n*² lg *n*), which is slower than our implementation. Using the Fibonacci heap, Fredman and Tarjan (1987) developed the fastest implementation of Prim's algorithm. Their implementation is Θ(*m* + *n* lg *n*). For a sparse graph, this is Θ(*n* lg *n*), and for a dense graph it is Θ(*n*²).

Prim's algorithm originally appeared in Jarník (1930) and was published by its namesake in Prim (1957). Kruskal's algorithm is from Kruskal (1956). The history of the Minimum Spanning Tree problem is discussed in Graham and Hell (1985). Other algorithms for the problem can be found in Yao (1975) and Tarjan (1983).
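The sparse-versus-dense comparison in Section 4.1.3 can be made concrete with a little arithmetic. The counts below are rough operation estimates derived from the two complexity expressions, not measured running times.

```python
import math

def prim_cost(n):
    # Θ(n²): independent of the number of edges m.
    return n * n

def kruskal_cost(m):
    # Θ(m lg m): driven by the number of edges.
    return m * math.log2(m)

n = 1024
sparse_m = n - 1              # tree-like graph: about n edges
dense_m = n * (n - 1) // 2    # complete graph: about n²/2 edges

# For the sparse graph Kruskal's estimate falls far below Prim's;
# for the dense graph the order reverses, matching Section 4.1.3.
```

Here `kruskal_cost(sparse_m)` is on the order of n lg n while `prim_cost(n)` is n², and for `dense_m` the Kruskal estimate exceeds the Prim estimate by roughly a lg n factor.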