## 1.4 Order

We just illustrated that an algorithm with a time complexity of *n* is more efficient than one with a time complexity of *n*² for sufficiently large values of *n*, regardless of how long it takes to process the basic operations in the two algorithms. Suppose now that we have two algorithms for the same problem and that their every-case time complexities are 100*n* for the first algorithm and 0.01*n*² for the second algorithm. Using an argument such as the one just given, we can show that the first algorithm will eventually be more efficient than the second one. For example, if it takes the same amount of time to process the basic operations in both algorithms and the overhead is about the same, the first algorithm will be more efficient if

$$100n < 0.01n^2.$$

Dividing both sides by 0.01*n* yields

$$10{,}000 < n.$$

If it takes longer to process the basic operation in the first algorithm than in the second, then there is simply some larger value of *n* at which the first algorithm becomes more efficient. Algorithms with time complexities such as *n* and 100*n* are called ***linear-time algorithms*** because their time complexities are linear in the input size *n*, whereas algorithms with time complexities such as *n*² and 0.01*n*² are called ***quadratic-time algorithms*** because their time complexities are quadratic in the input size *n*. There is a fundamental principle here: *any linear-time algorithm is eventually more efficient than any quadratic-time algorithm.* In the theoretical analysis of an algorithm, we are interested in eventual behavior. Next we will show how algorithms can be grouped according to their eventual behavior. In this way we can readily determine whether one algorithm's eventual behavior is better than another's.

### 1.4.1 An Intuitive Introduction to Order

Functions such as 5*n*² and 5*n*² + 100 are called ***pure quadratic*** functions because they contain no linear term, whereas a function such as 0.1*n*² + *n* + 100 is called a ***complete quadratic*** function because it contains a linear term. [Table 1.3](#ch01table03) shows that eventually the quadratic term dominates this function. That is, the values of the other terms eventually become insignificant compared with the value of the quadratic term. Therefore, although the function is not a pure quadratic function, we can classify it with the pure quadratic functions. This means that if some algorithm has this time complexity, we can call the algorithm a quadratic-time algorithm. Intuitively, it seems that we should always be able to throw away low-order terms when classifying complexity functions. For example, it seems that we should be able to classify 0.1*n*³ + 10*n*² + 5*n* + 25 with pure cubic functions. We will soon establish rigorously that we can do this. First let's try to gain an intuitive feel for how complexity functions are classified.

Table 1.3: The quadratic term eventually dominates

| *n* | 0.1*n*² | 0.1*n*² + *n* + 100 |
| --- | --- | --- |
| 10 | 10 | 120 |
| 20 | 40 | 160 |
| 50 | 250 | 400 |
| 100 | 1,000 | 1,200 |
| 1,000 | 100,000 | 101,100 |

The set of all complexity functions that can be classified with pure quadratic functions is called Θ(*n*²), where Θ is the Greek capital letter "theta." If a function is a member of the set Θ(*n*²), we say that the function is order of *n*².
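To make the dominance in Table 1.3 concrete, here is a minimal Python sketch (ours, not part of the text) that recomputes the table's entries along with the quadratic term's share of the total; the function names are arbitrary.

```python
# Recomputes Table 1.3: the quadratic term of 0.1n^2 + n + 100
# eventually dwarfs the lower-order terms. (Illustrative sketch only.)

def complete_quadratic(n):
    return 0.1 * n ** 2 + n + 100

def quadratic_term(n):
    return 0.1 * n ** 2

for n in [10, 20, 50, 100, 1000]:
    total = complete_quadratic(n)
    share = quadratic_term(n) / total
    print(f"n = {n:>5}: 0.1n^2 = {quadratic_term(n):>9,.0f}   "
          f"0.1n^2 + n + 100 = {total:>9,.0f}   quadratic share = {share:5.1%}")
```

At *n* = 10 the quadratic term accounts for under a tenth of the total; by *n* = 1,000 it accounts for nearly all of it.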
For example, because we can throw away low-order terms,

$$g(n) = 5n^2 + 100n + 60 \in \Theta(n^2),$$

which means that *g*(*n*) is order of *n*². As a more concrete example, recall from [Section 1.3.1](LiB0010.html#58) that the time complexity for [Algorithm 1.3](LiB0008.html#32) (Exchange Sort) is given by *T*(*n*) = *n*(*n* − 1)/2. Because

$$\frac{n(n-1)}{2} = \frac{n^2}{2} - \frac{n}{2},$$

throwing away the lower-order term *n*/2 shows that *T*(*n*) ∊ Θ(*n*²). When an algorithm's time complexity is in Θ(*n*²), the algorithm is called a ***quadratic-time algorithm*** or a **Θ(*n*²) algorithm.** We also say that the algorithm is Θ(*n*²). Exchange Sort is a quadratic-time algorithm.

Similarly, the set of complexity functions that can be classified with pure cubic functions is called Θ(*n*³), and functions in that set are said to be order of *n*³, and so on. We will call these sets ***complexity categories.*** The following are some of the most common complexity categories:

$$\Theta(\lg n) \qquad \Theta(n) \qquad \Theta(n \lg n) \qquad \Theta(n^2) \qquad \Theta(n^3) \qquad \Theta(2^n)$$

In this ordering, if *f*(*n*) is in a category to the left of the category containing *g*(*n*), then *f*(*n*) eventually lies beneath *g*(*n*) on a graph. [Figure 1.3](#ch01fig03) plots the simplest members of these categories: *n*, lg *n*, *n* lg *n*, and so on. [Table 1.4](#ch01table04) shows the execution times of algorithms whose time complexities are given by these functions. The simplifying assumption is that it takes 1 nanosecond (10⁻⁹ second) to process the basic operation for each algorithm. The table shows a possibly surprising result. One might expect that as long as an algorithm is not an exponential-time algorithm, it will be adequate. However, even the quadratic-time algorithm takes 31.7 years to process an instance with an input size of 1 billion. On the other hand, the Θ(*n* lg *n*) algorithm takes only 29.9 seconds to process such an instance. Ordinarily an algorithm has to be Θ(*n* lg *n*) or better for us to assume that it can process extremely large instances in tolerable amounts of time. This is not to say that algorithms whose time complexities are in the higher-order categories are not useful. Algorithms with quadratic, cubic, and even higher-order time complexities can often handle the actual instances that arise in many applications.

![Figure 1.3](https://box.kancloud.cn/538168e4202423f7efd3d5d0278f66c9_350x455.jpg)

Figure 1.3: Growth rates of some common complexity functions.

Table 1.4: Execution times for algorithms with the given time complexities

| *n* | *f*(*n*) = lg *n* | *f*(*n*) = *n* | *f*(*n*) = *n* lg *n* | *f*(*n*) = *n*² | *f*(*n*) = *n*³ | *f*(*n*) = 2ⁿ |
| --- | --- | --- | --- | --- | --- | --- |
| 10 | 0.003 μs\* | 0.01 μs | 0.033 μs | 0.10 μs | 1.0 μs | 1 μs |
| 20 | 0.004 μs | 0.02 μs | 0.086 μs | 0.40 μs | 8.0 μs | 1 ms† |
| 30 | 0.005 μs | 0.03 μs | 0.147 μs | 0.90 μs | 27.0 μs | 1 s |
| 40 | 0.005 μs | 0.04 μs | 0.213 μs | 1.60 μs | 64.0 μs | 18.3 min |
| 50 | 0.006 μs | 0.05 μs | 0.282 μs | 2.50 μs | 125.0 μs | 13 days |
| 10² | 0.007 μs | 0.10 μs | 0.664 μs | 10.00 μs | 1.0 ms | 4 × 10¹³ years |
| 10³ | 0.010 μs | 1.00 μs | 9.966 μs | 1.00 ms | 1.0 s | |
| 10⁴ | 0.013 μs | 10.00 μs | 130.000 μs | 100.00 ms | 16.7 min | |
| 10⁵ | 0.017 μs | 0.10 ms | 1.670 ms | 10.00 s | 11.6 days | |
| 10⁶ | 0.020 μs | 1.00 ms | 19.930 ms | 16.70 min | 31.7 years | |
| 10⁷ | 0.023 μs | 0.01 s | 0.266 s | 1.16 days | 31,709 years | |
| 10⁸ | 0.027 μs | 0.10 s | 2.660 s | 115.70 days | 3.17 × 10⁷ years | |
| 10⁹ | 0.030 μs | 1.00 s | 29.900 s | 31.70 years | | |

\* 1 ns = 10⁻⁹ second.

† 1 ms = 10⁻³ second.
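The table entries follow from simple arithmetic on the 1-nanosecond assumption. Below is a small Python sketch (not from the text) that recomputes a few of them; the helper names are ours.

```python
# Recomputes sample entries of Table 1.4 under its assumption that the
# basic operation takes 1 nanosecond. (Illustrative sketch only.)
import math

def pretty(seconds):
    """Render a duration in the largest convenient unit."""
    if seconds < 60:
        return f"{seconds:.4g} s"
    if seconds < 3_600:
        return f"{seconds / 60:.4g} min"
    if seconds < 86_400:
        return f"{seconds / 3_600:.4g} h"
    if seconds < 31_536_000:
        return f"{seconds / 86_400:.4g} days"
    return f"{seconds / 31_536_000:.4g} years"

for n in [10 ** 6, 10 ** 9]:
    print(f"n = {n:.0e}:")
    print(f"  n lg n -> {pretty(n * math.log2(n) * 1e-9)}")   # 19.93 ms, 29.9 s
    print(f"  n^2    -> {pretty(n ** 2 * 1e-9)}")             # 16.7 min, 31.7 years
```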
Before ending this discussion, we stress that there is more information in knowing a time complexity exactly than in simply knowing its order. For example, recall the hypothetical algorithms, discussed earlier, that have time complexities of 100*n* and 0.01*n*². If it takes the same amount of time to process the basic operations and execute the overhead instructions in both algorithms, then the quadratic-time algorithm is more efficient for instances smaller than 10,000. If the application never requires instances larger than this, the quadratic-time algorithm should be implemented. If we knew only that the time complexities were in Θ(*n*) and Θ(*n*²), respectively, we would not know this. The coefficients in this example are extreme, and in practice they are often less extreme. Furthermore, there are times when it is quite difficult to determine the time complexities exactly. Therefore, we are sometimes content to determine only the order.

### 1.4.2 A Rigorous Introduction to Order

The previous discussion imparted an intuitive feel for order (Θ). Here we develop theory that enables us to define order rigorously. We accomplish this by presenting two other fundamental concepts. The first is "big *O*."

**Definition** For a given complexity function *f*(*n*), *O*(*f*(*n*)) is the set of complexity functions *g*(*n*) for which there exists some positive real constant *c* and some nonnegative integer *N* such that, for all *n* ≥ *N*,

$$g(n) \le c \times f(n).$$

If *g*(*n*) ∊ *O*(*f*(*n*)), we say that *g*(*n*) is ***big O*** of *f*(*n*). [Figure 1.4(a)](#ch01fig04) illustrates "big *O*." Although *g*(*n*) starts out above *cf*(*n*) in that figure, eventually it falls beneath *cf*(*n*) and stays there. [Figure 1.5](#ch01fig05) shows a concrete example. Although *n*² + 10*n* is initially above 2*n*² in that figure, for *n* ≥ 10

$$n^2 + 10n \le 2n^2.$$

![Figure 1.4](https://box.kancloud.cn/732c9302833b53fb03cf4784b71fad5b_350x111.jpg)

Figure 1.4: Illustrating "big *O*," Ω, and Θ.

![Figure 1.5](https://box.kancloud.cn/9cb91c1529a991a702f54b11b688505b_350x249.jpg)

Figure 1.5: The function *n*² + 10*n* eventually stays beneath the function 2*n*².

We can therefore take *c* = 2 and *N* = 10 in the definition of "big *O*" to conclude that

$$n^2 + 10n \in O(n^2).$$

If, for example, *g*(*n*) is in *O*(*n*²), then eventually *g*(*n*) lies beneath some pure quadratic function *cn*² on a graph. This means that if *g*(*n*) is the time complexity for some algorithm, eventually the running time of the algorithm will be at least as fast as quadratic. For the purposes of analysis, we can say that eventually *g*(*n*) is at least as *good* as a pure quadratic function. "Big *O*" (and other concepts that will be introduced soon) is said to describe the **asymptotic** behavior of a function because it is concerned only with eventual behavior. We say that "big *O*" puts an ***asymptotic upper bound*** on a function. The following examples illustrate how to show "big *O*."

**Example 1.7** We show that 5*n*² ∊ *O*(*n*²). Because, for *n* ≥ 0,

$$5n^2 \le 5n^2,$$

we can take *c* = 5 and *N* = 0 to obtain our desired result.
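As a sanity check on such witness pairs (*c*, *N*), the following Python sketch (ours, not from the text) tests a candidate pair on sampled values. A finite check like this is only a sanity test, never a proof.

```python
# Finite sanity check (not a proof) of a "big O" witness pair (c, N):
# verifies g(n) <= c * f(n) on sampled n >= N. Names are ours.

def check_big_o_witness(g, f, c, N, samples=range(0, 10_000)):
    return all(g(n) <= c * f(n) for n in samples if n >= N)

g = lambda n: n ** 2 + 10 * n   # the function from Figure 1.5
f = lambda n: n ** 2

print(check_big_o_witness(g, f, c=2, N=10))   # True
print(check_big_o_witness(g, f, c=11, N=1))   # also True: witnesses are not unique
print([n for n in range(1, 10) if g(n) > 2 * f(n)])  # the bound may fail below N
```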
**Example 1.8** Recall that the time complexity of [Algorithm 1.3](LiB0008.html#32) (Exchange Sort) is given by

$$T(n) = \frac{n(n-1)}{2}.$$

Because, for *n* ≥ 0,

$$\frac{n(n-1)}{2} \le \frac{n \cdot n}{2} = \frac{1}{2}n^2,$$

we can take *c* = 1/2 and *N* = 0 to conclude that *T*(*n*) ∊ *O*(*n*²).

A difficulty students often have with "big *O*" is that they erroneously think there is some unique *c* and unique *N* that must be found to show that one function is "big *O*" of another. This is not the case at all. Recall that [Figure 1.5](#ch01fig05) illustrates that *n*² + 10*n* ∊ *O*(*n*²) using *c* = 2 and *N* = 10. Alternatively, we could show it as follows.

**Example 1.9** We show that *n*² + 10*n* ∊ *O*(*n*²). Because, for *n* ≥ 1,

$$n^2 + 10n \le n^2 + 10n^2 = 11n^2,$$

we can take *c* = 11 and *N* = 1 to obtain our result.

In general, one can show "big *O*" using whatever manipulations seem most straightforward.

**Example 1.10** We show that *n*² ∊ *O*(*n*² + 10*n*). Because, for *n* ≥ 0,

$$n^2 \le 1 \times (n^2 + 10n),$$

we can take *c* = 1 and *N* = 0 to obtain our result.

The purpose of this last example is to show that the function inside "big *O*" does not have to be one of the simple functions plotted in [Figure 1.3](#ch01fig03). It can be any complexity function. Ordinarily, however, we take it to be a simple function like those plotted in [Figure 1.3](#ch01fig03).

**Example 1.11** We show that *n* ∊ *O*(*n*²). Because, for *n* ≥ 1,

$$n \le 1 \times n^2,$$

we can take *c* = 1 and *N* = 1 to obtain our result.

This last example makes a crucial point about "big *O*". A complexity function need not have a quadratic term to be in *O*(*n*²). It need only eventually lie beneath some pure quadratic function on a graph. Therefore, any logarithmic or linear complexity function is in *O*(*n*²). Similarly, any logarithmic, linear, or quadratic complexity function is in *O*(*n*³), and so on. [Figure 1.6(a)](#ch01fig06) shows some exemplary members of *O*(*n*²).

![Figure 1.6](https://box.kancloud.cn/abc4d9f3c05270aae8000f788e99e3a4_350x125.jpg)

Figure 1.6: The sets *O*(*n*²), Ω(*n*²), and Θ(*n*²). Some exemplary members are shown.
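Since Exchange Sort's *T*(*n*) = *n*(*n* − 1)/2 anchors several of these examples, here is a short Python sketch (our rendering of the algorithm's double loop, not the book's pseudocode) that counts the comparisons actually performed and checks them against the closed form and the *c* = 1/2 bound:

```python
# Counts Exchange Sort's comparisons and checks T(n) = n(n-1)/2 <= (1/2)n^2.
# (Our rendering of the double loop; illustrative only.)

def exchange_sort(s):
    s = list(s)
    comparisons = 0
    for i in range(len(s) - 1):
        for j in range(i + 1, len(s)):
            comparisons += 1            # the basic operation
            if s[j] < s[i]:
                s[i], s[j] = s[j], s[i]
    return s, comparisons

for n in [5, 50, 500]:
    ordered, t = exchange_sort(range(n, 0, -1))
    assert ordered == sorted(ordered)
    assert t == n * (n - 1) // 2        # the closed form T(n)
    assert t <= n ** 2 / 2              # the bound from Example 1.8
print("T(n) = n(n-1)/2 and T(n) <= (1/2)n^2 confirmed for sampled n")
```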
Just as "big *O*" puts an asymptotic upper bound on a complexity function, the following concept puts an ***asymptotic lower bound*** on a complexity function.

**Definition** For a given complexity function *f*(*n*), Ω(*f*(*n*)) is the set of complexity functions *g*(*n*) for which there exists some positive real constant *c* and some nonnegative integer *N* such that, for all *n* ≥ *N*,

$$g(n) \ge c \times f(n).$$

The symbol Ω is the Greek capital letter "omega." If *g*(*n*) ∊ Ω(*f*(*n*)), we say that *g*(*n*) is ***Omega*** of *f*(*n*). [Figure 1.4(b)](#ch01fig04) illustrates Ω. Some examples follow.

**Example 1.12** We show that 5*n*² ∊ Ω(*n*²). Because, for *n* ≥ 0,

$$5n^2 \ge 1 \times n^2,$$

we can take *c* = 1 and *N* = 0 to obtain our result.

**Example 1.13** We show that *n*² + 10*n* ∊ Ω(*n*²). Because, for *n* ≥ 0,

$$n^2 + 10n \ge n^2,$$

we can take *c* = 1 and *N* = 0 to obtain our result.

**Example 1.14** Consider again the time complexity of [Algorithm 1.3](LiB0008.html#23) (Exchange Sort). We show that

$$T(n) = \frac{n(n-1)}{2} \in \Omega(n^2).$$

For *n* ≥ 2,

$$n - 1 \ge \frac{n}{2}.$$

Therefore, for *n* ≥ 2,

$$\frac{n(n-1)}{2} \ge \frac{n}{2} \times \frac{n}{2} = \frac{1}{4}n^2,$$

which means we can take *c* = 1/4 and *N* = 2 to obtain our result.

As is the case for "big *O*," there are no unique constants *c* and *N* for which the conditions in the definition of Ω hold. We can choose whichever ones make our manipulations easiest.

If a function is in Ω(*n*²), then eventually the function lies above some pure quadratic function on a graph. For the purposes of analysis, this means that eventually it is at least as *bad* as a pure quadratic function. However, as the following example illustrates, the function need not be a quadratic function.

**Example 1.15** We show that *n*³ ∊ Ω(*n*²). Because, for *n* ≥ 1,

$$n^3 \ge 1 \times n^2,$$

we can take *c* = 1 and *N* = 1 to obtain our result.

[Figure 1.6(b)](#ch01fig06) shows some exemplary members of Ω(*n*²).

If a function is in both *O*(*n*²) and Ω(*n*²), we can conclude that eventually the function lies beneath some pure quadratic function on a graph and eventually it lies above some pure quadratic function on a graph. That is, eventually it is at least as *good* as some pure quadratic function and eventually it is at least as *bad* as some pure quadratic function. We can therefore conclude that its growth is similar to that of a pure quadratic function.
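The sandwich just described can be spot-checked numerically. A minimal sketch (ours, not from the text) using the witnesses found in Examples 1.8 and 1.14; again, a finite check is a sanity test, not a proof:

```python
# T(n) = n(n-1)/2 stays between (1/4)n^2 and (1/2)n^2 for n >= 2,
# the constants from Examples 1.14 and 1.8 respectively.

def T(n):
    return n * (n - 1) / 2

assert all(0.25 * n ** 2 <= T(n) <= 0.5 * n ** 2 for n in range(2, 100_000))
print("(1/4)n^2 <= T(n) <= (1/2)n^2 for all sampled n >= 2")
```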
This is precisely the result we want for our rigorous notion of order. We have the following definition.

**Definition** For a given complexity function *f*(*n*),

$$\Theta(f(n)) = O(f(n)) \cap \Omega(f(n)).$$

This means that Θ(*f*(*n*)) is the set of complexity functions *g*(*n*) for which there exist some positive real constants *c* and *d* and some nonnegative integer *N* such that, for all *n* ≥ *N*,

$$c \times f(n) \le g(n) \le d \times f(n).$$

If *g*(*n*) ∊ Θ(*f*(*n*)), we say that *g*(*n*) is ***order*** of *f*(*n*).

**Example 1.16** Consider once more the time complexity of [Algorithm 1.3](LiB0008.html#32). [Examples 1.8](#ch01ex22) and [1.14](#ch01ex28) together establish that

$$T(n) = \frac{n(n-1)}{2} \in O(n^2) \quad \text{and} \quad T(n) = \frac{n(n-1)}{2} \in \Omega(n^2).$$

This means that *T*(*n*) ∊ *O*(*n*²) ∩ Ω(*n*²) = Θ(*n*²).

[Figure 1.6(c)](#ch01fig06) depicts that Θ(*n*²) is the intersection of *O*(*n*²) and Ω(*n*²), whereas [Figure 1.4(c)](#ch01fig04) illustrates Θ. Notice in [Figure 1.6(c)](#ch01fig06) that the function 5*n* + 7 is not in Ω(*n*²), and the function 4*n*³ + 3*n*² is not in *O*(*n*²). Therefore, neither of these functions is in Θ(*n*²). Although intuitively this seems correct, we have not yet proven it. The following example shows how such a proof proceeds.

**Example 1.17** We show that *n* is not in Ω(*n*²) by using ***proof by contradiction***. In this type of proof we assume something is true (in this case, that *n* ∊ Ω(*n*²)), and then we do manipulations that lead to a result that is not true. That is, the result *contradicts* something known to be true. We then conclude that what we assumed in the first place cannot be true. Assuming that *n* ∊ Ω(*n*²) means we are assuming that there exists some positive constant *c* and some nonnegative integer *N* such that, for all *n* ≥ *N*,

$$n \ge c n^2.$$

If we divide both sides of this inequality by *cn*, we have, for all *n* ≥ *N*,

$$\frac{1}{c} \ge n.$$

However, for any *n* > 1/*c*, this inequality cannot hold, which means that it cannot hold for all *n* ≥ *N*. This contradiction proves that *n* is not in Ω(*n*²).

We have one more definition concerning order that expresses relationships such as the one between the function *n* and the function *n*².

**Definition** For a given complexity function *f*(*n*), o(*f*(*n*)) is the set of all complexity functions *g*(*n*) satisfying the following: For every positive real constant *c* there exists a nonnegative integer *N* such that, for all *n* ≥ *N*,

$$g(n) \le c \times f(n).$$

If *g*(*n*) ∊ o(*f*(*n*)), we say that *g*(*n*) is ***small o*** of *f*(*n*). Recall that "big *O*" means there must be *some* real positive constant *c* for which the bound holds. This definition says that the bound must hold for *every* real positive constant *c*. Because the bound holds for every positive *c*, it holds for arbitrarily small *c*. For example, if *g*(*n*) ∊ o(*f*(*n*)), there is an *N* such that, for *n* ≥ *N*,

$$g(n) \le 0.00001 \times f(n).$$

We see that *g*(*n*) becomes insignificant relative to *f*(*n*) as *n* becomes large.
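To see this numerically, a tiny sketch (ours) with *g*(*n*) = *n* and *f*(*n*) = *n*²: the ratio *g*(*n*)/*f*(*n*) eventually drops beneath any fixed *c*, including *c* = 0.00001.

```python
# With g(n) = n and f(n) = n^2, the ratio g(n)/f(n) = 1/n shrinks beneath
# any positive constant c once n is large enough. (Illustrative sketch.)

c = 0.00001
for n in [10, 1_000, 100_000, 10_000_000]:
    ratio = n / n ** 2
    print(f"n = {n:>10,}: g(n)/f(n) = {ratio:.1e}   "
          f"g(n) <= c*f(n)? {n <= c * n ** 2}")
```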
For the purposes of analysis, if *g*(*n*) is in o(*f*(*n*)), then *g*(*n*) is eventually much *better* than functions such as *f*(*n*). The following examples illustrate this.

**Example 1.18** We show that

$$n \in o(n^2).$$

Let *c* > 0 be given. We need to find an *N* such that, for *n* ≥ *N*,

$$n \le c n^2.$$

If we divide both sides of this inequality by *cn*, we get

$$\frac{1}{c} \le n.$$

Therefore, it suffices to choose any *N* ≥ 1/*c*. Notice that the value of *N* depends on the constant *c*. For example, if *c* = 0.00001, we must take *N* equal to at least 100,000. That is, for *n* ≥ 100,000,

$$n \le 0.00001 n^2.$$

**Example 1.19** We show that *n* is not in o(5*n*). We will use proof by contradiction to show this. Let *c* = 1/6. If *n* ∊ o(5*n*), then there must exist some *N* such that, for *n* ≥ *N*,

$$n \le \frac{1}{6}(5n) = \frac{5n}{6}.$$

Because this inequality cannot hold for any *n* ≥ 1, it cannot hold for all *n* ≥ *N*. This contradiction proves that *n* is not in o(5*n*).

The following theorem relates "small *o*" to our other asymptotic notation.

**Theorem 1.2** If *g*(*n*) ∊ o(*f*(*n*)), then

$$g(n) \in O(f(n)) - \Omega(f(n)).$$

That is, *g*(*n*) is in *O*(*f*(*n*)) but is not in Ω(*f*(*n*)).

Proof: Because *g*(*n*) ∊ o(*f*(*n*)), for every positive real constant *c* there exists an *N* such that, for all *n* ≥ *N*,

$$g(n) \le c \times f(n),$$

which means that the bound certainly holds for some *c*. Therefore,

$$g(n) \in O(f(n)).$$

We will show that *g*(*n*) is not in Ω(*f*(*n*)) using proof by contradiction. If *g*(*n*) ∊ Ω(*f*(*n*)), then there exists some real constant *c* > 0 and some *N*₁ such that, for all *n* ≥ *N*₁,

$$g(n) \ge c \times f(n).$$

But, because *g*(*n*) ∊ o(*f*(*n*)), there exists some *N*₂ such that, for all *n* ≥ *N*₂,

$$g(n) \le \frac{c}{2} \times f(n).$$

Both inequalities would have to hold for all *n* greater than both *N*₁ and *N*₂. This contradiction proves that *g*(*n*) cannot be in Ω(*f*(*n*)).

You may think that o(*f*(*n*)) and *O*(*f*(*n*)) − Ω(*f*(*n*)) must be the same set. This is not true. There are unusual functions that are in *O*(*f*(*n*)) − Ω(*f*(*n*)) but that are not in o(*f*(*n*)). The following example illustrates this.

**Example 1.20** Consider the function

$$g(n) = \begin{cases} n & \text{if } n \text{ is even} \\ n^2 & \text{if } n \text{ is odd.} \end{cases}$$

It is left as an exercise to show that

$$g(n) \in O(n^2) - \Omega(n^2) \qquad \text{but} \qquad g(n) \notin o(n^2).$$

[Example 1.20](#ch01ex29), of course, is quite contrived.
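A quick look (ours, and relying on the parity-alternating form as reconstructed in Example 1.20 above): the ratio *g*(*n*)/*n*² keeps returning to 1 (so *g* is not in o(*n*²)) while also dipping toward 0 (so *g* is not in Ω(*n*²)), yet *g*(*n*) never exceeds *n*² (so *g* is in *O*(*n*²)).

```python
# The Example 1.20 function (as reconstructed above) alternates between
# n and n^2, so g(n)/n^2 oscillates between 1/n and 1. (Illustrative.)

def g(n):
    return n if n % 2 == 0 else n ** 2

for n in range(100, 108):
    print(f"n = {n}: g(n)/n^2 = {g(n) / n ** 2:.4f}")
```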
When complexity functions represent the time complexities of actual algorithms, ordinarily the functions in *O*(*f*(*n*)) − Ω(*f*(*n*)) are the same ones that are in o(*f*(*n*)).

Notice that

$$g(n) \in \Theta(f(n)) \quad \text{if and only if} \quad f(n) \in \Theta(g(n)).$$

For example,

$$n^2 + 5n \in \Theta(n^2) \quad \text{and} \quad n^2 \in \Theta(n^2 + 5n).$$

This means that Θ separates complexity functions into disjoint sets. We will call these sets ***complexity categories.*** Any function from a given category can represent the category. For convenience, we ordinarily represent a category by its simplest member. The previous complexity category is represented by Θ(*n*²).

The time complexities of some algorithms do not increase with *n*. For example, recall that the best-case time complexity *B*(*n*) for [Algorithm 1.1](LiB0008.html#27) is 1 for every value of *n*. The complexity category containing such functions can be represented by any constant, and for simplicity we represent it by Θ(1).

The following are some important properties of order that make it easy to determine the orders of many complexity functions. They are stated without proof. The proofs of some will be derived in the exercises, whereas the proofs of others follow from results obtained in the next subsection. The second result we have already discussed. It is included here for completeness.

Properties of Order:

1. *g*(*n*) ∊ *O*(*f*(*n*)) if and only if *f*(*n*) ∊ Ω(*g*(*n*)).
2. *g*(*n*) ∊ Θ(*f*(*n*)) if and only if *f*(*n*) ∊ Θ(*g*(*n*)).
3. If *b* > 1 and *a* > 1, then $\log_a n \in \Theta(\log_b n)$. This implies that all logarithmic complexity functions are in the same complexity category. We will represent this category by Θ(lg *n*).
4. If *b* > *a* > 0, then $a^n \in o(b^n)$. This implies that all exponential complexity functions are not in the same complexity category.
5. For all *a* > 0, $a^n \in o(n!)$. This implies that *n*! is *worse* than any exponential complexity function.
6. Consider the following ordering of complexity categories:

   $$\Theta(\lg n) \quad \Theta(n) \quad \Theta(n \lg n) \quad \Theta(n^2) \quad \Theta(n^j) \quad \Theta(n^k) \quad \Theta(a^n) \quad \Theta(b^n) \quad \Theta(n!),$$

   where *k* > *j* > 2 and *b* > *a* > 1. If a complexity function *g*(*n*) is in a category that is to the left of the category containing *f*(*n*), then

   $$g(n) \in o(f(n)).$$

7. If *c* ≥ 0, *d* > 0, *g*(*n*) ∊ *O*(*f*(*n*)), and *h*(*n*) ∊ Θ(*f*(*n*)), then

   $$c \times g(n) + d \times h(n) \in \Theta(f(n)).$$

**Example 1.21** Property 3 states that all logarithmic complexity functions are in the same complexity category. For example,

$$\log_4 n \in \Theta(\lg n).$$

This means that the relationship between log₄ *n* and lg *n* is the same as the one between 7*n*² + 5*n* and *n*².

**Example 1.22** Property 6 states that any logarithmic function is eventually better than any polynomial, any polynomial is eventually better than any exponential function, and any exponential function is eventually better than the factorial function. For example,

$$\lg n \in o(n), \qquad n^{10} \in o(2^n), \qquad \text{and} \qquad 2^n \in o(n!).$$
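A numeric illustration (ours, not from the text) of Properties 3 and 4: the ratio of two logarithmic functions settles at a positive constant, while the ratio of two exponential functions collapses to 0.

```python
# Property 3: log_4 n / lg n is the constant 1/2 (same category).
# Property 4: 2^n / 3^n = (2/3)^n heads to 0 (different categories).
import math

for n in [10, 100, 1000]:
    log_ratio = math.log(n, 4) / math.log2(n)
    exp_ratio = (2 / 3) ** n
    print(f"n = {n:>4}: log4(n)/lg(n) = {log_ratio:.3f}   (2/3)^n = {exp_ratio:.3e}")
```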
**Example 1.23** Properties 6 and 7 can be used repeatedly. For example, we can show that 5*n* + 3 lg *n* + 10*n* lg *n* + 7*n*² ∊ Θ(*n*²), as follows. Repeatedly applying Properties 6 and 7, we have

$$5n + 3\lg n + 10 n \lg n + 7n^2 \in \Theta(n^2).$$

In practice, we do not repeatedly appeal to the properties, but rather we simply realize that we can throw out low-order terms.

If we can obtain the exact time complexity of an algorithm, we can determine its order simply by throwing out low-order terms. When this is not possible, we can appeal back to the definitions of "big *O*" and Ω to determine order. For example, suppose for some algorithm we are unable to determine *T*(*n*) [or *W*(*n*), *A*(*n*), or *B*(*n*)] exactly. If we can show that

> *T*(*n*) ∊ *O*(*f*(*n*)) and *T*(*n*) ∊ Ω(*f*(*n*))

by appealing directly to the definitions, we can conclude that *T*(*n*) ∊ Θ(*f*(*n*)). Sometimes it is fairly easy to show that *T*(*n*) ∊ *O*(*f*(*n*)) but difficult to determine whether *T*(*n*) is in Ω(*f*(*n*)). In such cases we may be content to show only that *T*(*n*) ∊ *O*(*f*(*n*)), because this implies that *T*(*n*) is at least as *good* as functions such as *f*(*n*). Similarly, we may be content to show only that *T*(*n*) ∊ Ω(*f*(*n*)), because this implies that *T*(*n*) is at least as *bad* as functions such as *f*(*n*).

Before closing, we mention that many authors say

> *f*(*n*) = Θ(*n*²) instead of *f*(*n*) ∊ Θ(*n*²).

Both mean the same thing, namely, that *f*(*n*) is a member of the set Θ(*n*²). Similarly, it is common to write

$$g(n) = O(f(n)) \quad \text{instead of} \quad g(n) \in O(f(n)).$$

You are referred to Knuth (1973) for an account of the history of "order" and to Brassard (1985) for a discussion of the definitions of order given here. Our definitions of "big *O*," Ω, and Θ are, for the most part, standard. There are, however, other definitions of "small *o*." It is not standard to call the sets Θ(*n*), Θ(*n*²), and so on, "complexity categories." Some authors call them "complexity classes," although this term is used more often to refer to the sets of problems discussed in [Chapter 9](LiB0074.html#880). Other authors do not give them any particular name at all.

### 1.4.3 Using a Limit to Determine Order

We now show how order can sometimes be determined using a limit. This material is included for those familiar with limits and derivatives. Knowledge of this material is not required elsewhere in the text.

**Theorem 1.3** We have the following:

$$\lim_{n\to\infty} \frac{g(n)}{f(n)} = \begin{cases} c > 0 & \text{implies} \quad g(n) \in \Theta(f(n)) \\ 0 & \text{implies} \quad g(n) \in o(f(n)) \\ \infty & \text{implies} \quad f(n) \in o(g(n)). \end{cases}$$

Proof: The proof is left as an exercise.
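Theorem 1.3 suggests a quick, informal experiment: evaluate *g*(*n*)/*f*(*n*) at growing *n* and see where it is headed. The sketch below (ours) does this for three pairs; a real limit argument is still required before drawing a conclusion.

```python
# Crude numeric version of Theorem 1.3's limit test. Suggestive only.
import math

cases = [
    ("(n^2/2) / n^2", lambda n: (n ** 2 / 2) / n ** 2),  # -> 1/2: Theta(n^2)
    ("lg n / n",      lambda n: math.log2(n) / n),       # -> 0:   o(n)
    ("n^3 / n^2",     lambda n: n ** 3 / n ** 2),        # -> inf: n^2 in o(n^3)
]
for name, ratio in cases:
    values = [f"{ratio(10 ** k):.4g}" for k in range(1, 7)]
    print(f"{name:>14}: {values}")
```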
**Example 1.24** [Theorem 1.3](#ch01ex33) implies that

$$\frac{n^2}{2} \in \Theta(n^2)$$

because

$$\lim_{n\to\infty} \frac{n^2/2}{n^2} = \frac{1}{2} > 0.$$

Using [Theorem 1.3](#ch01ex33) in [Example 1.24](#ch01ex34) is not very exciting because the result could have easily been established directly. The following examples are more interesting.

**Example 1.25** [Theorem 1.3](#ch01ex33) implies that, for *b* > *a* > 0,

$$a^n \in o(b^n)$$

because

$$\lim_{n\to\infty} \frac{a^n}{b^n} = \lim_{n\to\infty} \left(\frac{a}{b}\right)^n = 0.$$

The limit is 0 because 0 < *a*/*b* < 1. This is Property 4 in the Properties of Order (near the end of [Section 1.4.2](#ch01lev2sec7)).

**Example 1.26** [Theorem 1.3](#ch01ex33) implies that, for *a* > 0,

$$a^n \in o(n!) \qquad \text{because} \qquad \lim_{n\to\infty} \frac{a^n}{n!} = 0.$$

If *a* ≤ 1, the result is trivial because then *a*ⁿ/*n*! ≤ 1/*n*!. Suppose that *a* > 1, and let *m* be an integer with *m* > 2*a*. For *n* > *m*, every factor *a*/*i* with *i* > *m* is less than 1/2, so

$$\frac{a^n}{n!} = \frac{a^m}{m!} \times \frac{a}{m+1} \times \cdots \times \frac{a}{n} < \frac{a^m}{m!}\left(\frac{1}{2}\right)^{n-m}.$$

Because (1/2)ⁿ⁻ᵐ approaches 0 as *n* grows, this implies that

$$\lim_{n\to\infty} \frac{a^n}{n!} = 0.$$

This is Property 5 in the Properties of Order.
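A numeric glance (ours) at Example 1.26's limit, with the arbitrary choice *a* = 5:

```python
# a^n / n! plunges toward 0, illustrating a^n ∈ o(n!). (a = 5 is arbitrary.)
from math import factorial

a = 5
for n in [10, 25, 50, 100]:
    print(f"n = {n:>3}: {a}^n / n! = {a ** n / factorial(n):.3e}")
```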
The following theorem, whose proof can be found in most calculus texts, enhances the usefulness of [Theorem 1.3](#ch01ex33).

**Theorem 1.4** (*L'Hôpital's Rule*) If *f*(*x*) and *g*(*x*) are both differentiable with derivatives *f*′(*x*) and *g*′(*x*), respectively, and if

$$\lim_{x\to\infty} f(x) = \lim_{x\to\infty} g(x) = \infty,$$

then

$$\lim_{x\to\infty} \frac{f(x)}{g(x)} = \lim_{x\to\infty} \frac{f'(x)}{g'(x)}$$

whenever the limit on the right exists.

[Theorem 1.4](#ch01ex37) holds for functions of real variables, whereas our complexity functions are functions of integer variables. However, most of our complexity functions (for example, lg *n*, *n*, etc.) are also functions of real variables. Furthermore, if a function *f*(*x*) is a function of a real variable *x*, then

$$\lim_{n\to\infty} f(n) = \lim_{x\to\infty} f(x),$$

where *n* is an integer, whenever the limit on the right exists. Therefore, we can apply [Theorem 1.4](#ch01ex37) to complexity analysis, as the following examples illustrate.

**Example 1.27** [Theorems 1.3](#ch01ex33) and [1.4](#ch01ex37) imply that

$$\lg n \in o(n)$$

because

$$\lim_{n\to\infty} \frac{\lg n}{n} = \lim_{x\to\infty} \frac{\lg x}{x} = \lim_{x\to\infty} \frac{1/(x \ln 2)}{1} = 0.$$

**Example 1.28** [Theorems 1.3](#ch01ex33) and [1.4](#ch01ex37) imply that, for *b* > 1 and *a* > 1,

$$\log_a n \in \Theta(\log_b n)$$

because

$$\lim_{n\to\infty} \frac{\log_a n}{\log_b n} = \lim_{x\to\infty} \frac{\log_a x}{\log_b x} = \lim_{x\to\infty} \frac{1/(x \ln a)}{1/(x \ln b)} = \frac{\ln b}{\ln a} > 0.$$

This is Property 3 in the Properties of Order.
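For readers who want to check such limits mechanically, here is a minimal sketch (ours, not part of the text, and assuming the third-party SymPy package is installed) that confirms the limits behind Examples 1.27 and 1.28:

```python
# Symbolic confirmation of the limits in Examples 1.27 and 1.28,
# assuming SymPy is available.
import sympy as sp

n = sp.symbols('n', positive=True)
print(sp.limit(sp.log(n, 2) / n, n, sp.oo))             # 0   -> lg n ∈ o(n)
print(sp.limit(sp.log(n, 4) / sp.log(n, 2), n, sp.oo))  # 1/2 -> log_4 n ∈ Θ(lg n)
```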