## 3.3 Dynamic Programming and Optimization Problems

Recall that [Algorithm 3.4](LiB0026.html#280) not only determines the lengths of the shortest paths but also constructs shortest paths. The construction of the optimal solution is a third step in the development of a dynamic programming algorithm for an optimization problem. This means that the steps in the development of such an algorithm are as follows:

1. Establish a recursive property that gives the optimal solution to an instance of the problem.
2. Compute the value of an optimal solution in a *bottom-up* fashion.
3. Construct an optimal solution in a *bottom-up* fashion.

Steps 2 and 3 are ordinarily accomplished at about the same point in the algorithm. Because [Algorithm 3.2](LiB0025.html#263) does not solve an optimization problem, there is no third step.

Although it may seem that any optimization problem can be solved using dynamic programming, this is not the case. The principle of optimality must apply in the problem. That principle can be stated as follows:

Definition The ***principle of optimality*** is said to apply in a problem if an optimal solution to an instance of the problem always contains optimal solutions to all subinstances.

The principle of optimality is difficult to state precisely and can be better understood by looking at an example. In the case of the Shortest Paths problem, we showed that if *v*k is a vertex on an optimal path from *v*i to *v*j, then the subpaths from *v*i to *v*k and from *v*k to *v*j must also be optimal. Therefore, the optimal solution to the instance contains optimal solutions to all subinstances, and the principle of optimality applies. If the principle of optimality applies in a given problem, we can develop a recursive property that gives an optimal solution to an instance in terms of optimal solutions to subinstances.
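To make the three steps concrete, here is a minimal Python sketch in the spirit of Algorithm 3.4 (Floyd's algorithm). The recursive property (step 1) is the comparison `D[i][k] + D[k][j] < D[i][j]`; the table `D` computes optimal values bottom up (step 2); and the table `P`, which records an intermediate vertex on each improved path, lets us construct an optimal path afterward (step 3). The function and variable names here are mine, not the book's.

```python
INF = float('inf')

def floyd2(W):
    """Compute shortest-path lengths D and the intermediate-vertex
    table P from the weight matrix W, bottom up (Floyd's algorithm)."""
    n = len(W)
    D = [row[:] for row in W]
    P = [[0] * n for _ in range(n)]          # 0 means "no intermediate vertex"
    for k in range(n):                       # allow vertex k+1 as an intermediate
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
                    P[i][j] = k + 1          # remember the 1-based intermediate vertex
    return D, P

def path(P, i, j, out):
    """Append to out the intermediate vertices (1-based) on a shortest
    path from vertex i to vertex j, using the table P."""
    k = P[i - 1][j - 1]
    if k != 0:
        path(P, i, k, out)
        out.append(k)
        path(P, k, j, out)
```

For example, with `W = [[0, 1, INF], [INF, 0, 2], [1, INF, 0]]`, the call `floyd2(W)` gives `D[0][2] == 3`, and `path(P, 1, 3, out)` fills `out` with `[2]`, reconstructing the shortest path [*v*1, *v*2, *v*3].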
The important but subtle reason we can then use dynamic programming to construct an optimal solution to an instance is that the optimal solutions to the subinstances can be *any* optimal solutions. For example, in the case of the Shortest Paths problem, if the subpaths are any shortest paths, the combined path will be optimal. We can therefore use the recursive property to construct optimal solutions to increasingly large instances from the bottom up. Each solution along the way will always be optimal.

Although the principle of optimality may appear obvious, in practice it is necessary to show that the principle applies before assuming that an optimal solution can be obtained using dynamic programming. The following example shows that it does not apply in every optimization problem.

Example 3.4

Consider the Longest Paths problem of finding the longest simple paths from each vertex to all other vertices. We restrict the Longest Paths problem to simple paths because with a cycle we can always create an arbitrarily long path by repeatedly passing through the cycle. In [Figure 3.6](#ch03fig06) the optimal (longest) simple path from *v*1 to *v*4 is \[*v*1, *v*3, *v*2, *v*4\]. However, the subpath \[*v*1, *v*3\] is not an optimal (longest) path from *v*1 to *v*3 because

length \[*v*1, *v*3\] < length \[*v*1, *v*2, *v*3\]

[![Figure 3.6](https://box.kancloud.cn/e816081824b3b97b26e756d6d4ac12b8_257x361.jpg)](fig3-6_0.jpg)

Figure 3.6: A weighted, directed graph with a cycle.

Therefore, the principle of optimality does not apply. The reason is that the optimal paths from *v*1 to *v*3 and from *v*3 to *v*4 cannot be strung together to give an optimal path from *v*1 to *v*4. Doing so would create a cycle rather than an optimal path.
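The failure of the principle of optimality can be checked by brute force on a small graph. The sketch below enumerates all simple paths by depth-first search and returns the longest. The edge weights are illustrative choices of my own (the weights in Figure 3.6 may differ), picked so that the graph has the same shape as the example: the longest simple path from *v*1 to *v*4 goes \[*v*1, *v*3, *v*2, *v*4\], yet its subpath \[*v*1, *v*3\] is not the longest path from *v*1 to *v*3.

```python
from collections import defaultdict

def longest_simple_path(edges, s, t):
    """Return (length, path) of a longest simple path from s to t by
    exhaustive DFS. Exponential time, but fine for a tiny illustration."""
    adj = defaultdict(list)
    for (u, v), w in edges.items():
        adj[u].append((v, w))
    best = (float('-inf'), None)
    def dfs(u, length, path):
        nonlocal best
        if u == t:
            if length > best[0]:
                best = (length, path[:])
            return
        for v, w in adj[u]:
            if v not in path:            # keep the path simple
                path.append(v)
                dfs(v, length + w, path)
                path.pop()
    dfs(s, 0, [s])
    return best

# Illustrative weights (hypothetical, not necessarily the book's):
edges = {(1, 2): 1, (1, 3): 1, (2, 3): 2, (3, 2): 3, (2, 4): 1}
```

With these weights, `longest_simple_path(edges, 1, 4)` returns `(5, [1, 3, 2, 4])`, while `longest_simple_path(edges, 1, 3)` returns `(3, [1, 2, 3])`: the longest paths from *v*1 to *v*3 and from *v*3 onward cannot be strung together without revisiting *v*2, so the subinstance optima are useless for building the overall optimum.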
The remainder of this chapter is concerned with optimization problems. When developing the algorithms, we will not explicitly mention the steps outlined earlier. It should be clear that they are being followed.