DAGs. One very special type of graph is the directed acyclic graph, also called a DAG.
DAGSPT. While Dijkstra's algorithm works fine on any DAG with non-negative edges, completing in $O(E \log V + V \log V)$ time, we can do slightly better performance-wise using the DAGSPT algorithm. DAGSPT simply consists of first finding a topological ordering of the vertices, then relaxing all of the edges out of each vertex in topological order. The runtime of this algorithm is $\Theta(E + V)$. This algorithm does not work on general graphs, only on DAGs, but it works even if we have negative edges.
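As a rough sketch of how DAGSPT might look in code (the function name, the edge-list format, and the use of Kahn's algorithm for the topological sort are my own choices, not prescribed by this guide):

```python
from collections import defaultdict

def dagspt(num_vertices, edges, source):
    """Shortest paths in a DAG: topologically sort, then relax each
    vertex's outgoing edges in topological order. Runs in Theta(E + V).
    `edges` is a list of (u, v, weight); negative weights are fine."""
    adj = defaultdict(list)
    indegree = [0] * num_vertices
    for u, v, w in edges:
        adj[u].append((v, w))
        indegree[v] += 1

    # Topological order via Kahn's algorithm.
    order = [u for u in range(num_vertices) if indegree[u] == 0]
    for u in order:
        for v, _ in adj[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                order.append(v)

    dist = [float('inf')] * num_vertices
    dist[source] = 0
    for u in order:
        if dist[u] == float('inf'):
            continue  # unreachable from the source
        for v, w in adj[u]:  # relax every edge out of u
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist
```

Note that the relaxation loop visits each edge exactly once, which is where the $\Theta(E + V)$ bound comes from.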
Dynamic Programming. Dynamic programming is the following process:
- Identify a collection of subproblems.
- Solve subproblems, working from smallest to largest.
- Use the answers from the smaller problems to help solve the larger ones.
Dynamic Programming and SPT. The DAGSPT algorithm is an example of dynamic programming. We start by solving the simplest subproblem (distance from the source to the source), and then use the results of that subproblem to solve larger subproblems (distance from the source to vertices two edges away), and so forth.
Longest Increasing Subsequence. As an example of DP, we studied the longest increasing subsequence (LIS) problem. This is not because the LIS problem is particularly important, but rather because it's one of the easiest non-trivial DP problems. The LIS problem is: Given a sequence of numbers, find the longest (not necessarily contiguous) subsequence that is increasing. For example, given the sequence [6, 2, 8, 4, 5, 7], the LIS is [2, 4, 5, 7]. The length of the longest increasing subsequence (LLIS) problem asks us to simply find the length of the LIS. For example, given the sequence [6, 2, 8, 4, 5, 7], the LLIS is 4.
The LLIS Problem and Longest Path. We can represent an instance of the LLIS problem as a DAG. We create one vertex per item in the sequence, numbered from 0 to N - 1. We draw an arrow from i to j if i comes before j in the sequence and the ith element is less than the jth element. The solution to the LLIS problem is then simply: find the path with the most edges in this DAG, starting from any vertex.
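The DAG construction described above might be sketched as follows (the function name is my own). Because edges only go from earlier items to later, larger items, every path in this DAG corresponds to an increasing subsequence:

```python
def build_llis_dag(seq):
    """One vertex per item (0 to N - 1); an edge i -> j whenever i comes
    before j in the sequence and seq[i] < seq[j]."""
    n = len(seq)
    return [(i, j) for i in range(n)
                   for j in range(i + 1, n)
                   if seq[i] < seq[j]]
```

For the example sequence [6, 2, 8, 4, 5, 7], this produces edges such as (1, 3) (since 2 < 4) but no edge (0, 1) (since 6 is not less than 2).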
Solving the Most-Edges Path Problem. One solution to the most-edges path problem is to set all edge weights to -1, then use the DAGSPT algorithm to find the shortest-paths tree from each vertex. Since every edge has weight -1, the shortest path in each SPT will be the one with the most edges. In other words, given the SPT for vertex i, the length of the LLIS that starts at item i is one more than the absolute value of the minimum of the distTo array. In class, I called this DAG problem "longest path", but in retrospect, I think "most edges path" is a clearer name, so I will use it throughout this study guide instead.
Implementing Our Most-Edges Based Approach. Our LLIS algorithm thus consists of first forming a DAG with edges of weight -1, then finding a topological ordering of this graph, then running DAGSPT N times and recording the minimum of each distTo array, and finally returning 1 + the absolute value of the minimum of these minimums. The runtime of this algorithm is $O(N^3)$. See the B level problems and/or extra slides for why.
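A sketch of this $O(N^3)$ approach in Python (all names are my own; note that the index order 0, ..., N - 1 is already a topological order, since every edge points from a smaller index to a larger one):

```python
def llis_via_dagspt(seq):
    """LLIS via N runs of DAGSPT on a DAG whose edges all have weight -1.
    Vertices 0..N-1 in index order are already topologically sorted."""
    n = len(seq)
    if n == 0:
        return 0
    # adj[i] lists the vertices reachable from i by one -1-weight edge.
    adj = [[j for j in range(i + 1, n) if seq[i] < seq[j]]
           for i in range(n)]
    best = 0  # most negative distTo entry over all N shortest-paths trees
    for source in range(n):
        dist = [float('inf')] * n
        dist[source] = 0
        for u in range(n):  # relax in topological (index) order
            if dist[u] == float('inf'):
                continue
            for v in adj[u]:
                if dist[u] - 1 < dist[v]:  # every edge has weight -1
                    dist[v] = dist[u] - 1
        best = min(best, min(dist))
    return 1 + abs(best)  # a path with k edges visits k + 1 items
```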
Reduction. Transforming a problem from one domain and solving it in that new domain is sometimes called "reduction". For example, we reduced the LLIS problem to N solutions of the most-edges-path-on-a-DAG problem. Other informal examples of reduction: we can reduce "illuminating a room" to "flipping a light switch", we can reduce "getting to work" to "riding BART", and we can reduce "8-puzzle" to "A*".
Inefficiency of Using Most-Edges to Solve LLIS.* Reduction to N solutions of the most-edges path problem works, but is inefficient. In particular, we observe that the later shortest-paths trees are in fact subtrees of the earlier shortest-paths trees. In effect, we are solving a bunch of subproblems in the wrong order!
LLIS Using DP. Our earlier approach boiled down to solving the LLIS problem starting from each separate vertex V. However, if we instead solve the LLIS problem ending at each vertex V, we can save ourselves lots of work. Define Q(K) to be the length of the LLIS ending at vertex number K. For example, given the sequence [6, 2, 8, 4, 5, 7], Q(3) is 2, since the longest increasing subsequence ending at the value 4 (which is item #3) is [2, 4], of length 2. Calculating these Q(K) values constitutes the subproblems of our DP approach.
Using Smaller Qs to Solve Larger Qs.* For our approach to be dynamic programming, we must be able to solve our larger subproblems in terms of our smaller ones. Trivially, we have that Q(0) is 1, since the LLIS ending at vertex 0 is just that item by itself. For subsequent Q(K), we have that Q(K) is equal to 1 + Q(M), where M is the vertex with the largest known Q value such that there is an edge from M to K (equivalently, such that the Mth item comes earlier and is less than the Kth item). If no such M exists, then Q(K) is simply 1.
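This recurrence can also be written directly as a (memoized) recursive function; a sketch, with names of my own choosing:

```python
from functools import lru_cache

def llis_lengths(seq):
    """Returns the list of Q(K) values, where Q(K) is the length of the
    LLIS ending at item K. Recurrence: Q(K) = 1 + max Q(M) over all M < K
    with seq[M] < seq[K], or 1 if no such M exists."""
    @lru_cache(maxsize=None)
    def Q(k):
        prev = [Q(m) for m in range(k) if seq[m] < seq[k]]
        return 1 + max(prev) if prev else 1
    return [Q(k) for k in range(len(seq))]
```

On the example sequence [6, 2, 8, 4, 5, 7], this yields Q values [1, 1, 2, 2, 3, 4], matching Q(3) = 2 from the earlier example.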
Using Qs to Solve Our DP. Suppose that we have calculated Q(K) for all K. Recalling that Q(K) is defined as the length of the LLIS ending at item K, the length of the LLIS of the entire sequence is simply the maximum over all Q(K).
Implementing our LLIS DP Algorithm. While the DAG is a useful abstraction to guide our thinking, it is ultimately unnecessary. Our solution to LLIS can be simply presented as:
- Create Q, an array of length N, and set every entry of Q to 1 (each item by itself is an increasing subsequence of length 1).
- For each K = 1, ..., N - 1, then for each item L = 0, ..., K - 1: if item L is less than item K, and if Q[L] + 1 > Q[K], then set Q[K] = Q[L] + 1.
- Return the maximum of the Q array.
The runtime of this algorithm is simply $\Theta(N^2)$.
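A sketch of this $\Theta(N^2)$ algorithm in Python (the function name is my own):

```python
def llis_dp(seq):
    """Bottom-up Theta(N^2) LLIS: Q[k] holds the length of the longest
    increasing subsequence ending at item k."""
    n = len(seq)
    if n == 0:
        return 0
    Q = [1] * n  # each item alone is an increasing subsequence of length 1
    for k in range(1, n):
        for l in range(k):
            if seq[l] < seq[k] and Q[l] + 1 > Q[k]:
                Q[k] = Q[l] + 1
    return max(Q)  # the LLIS ends at some item
```

The two nested loops examine each pair (L, K) with L < K exactly once, giving the $\Theta(N^2)$ bound.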
- Why is the runtime of the DAGSPT-based LLIS algorithm $O(N^3)$? Answer below (highlight to see):
- $\Theta(E + V)$ to create the weighted DAG.
- $\Theta(E + V)$ to compute a topological ordering of the vertices.
- $V$ executions of DAGSPT, for a total runtime of $\Theta(EV + V^2)$.
- $\Theta(V)$ time to find the minimum of each distTo array.
- $\Theta(V)$ time to find the minimum of minimums.
In turn, we have $O(N^2)$ edges, and exactly $N$ vertices, so the overall runtime is $O(N^3)$.
- Try out some of the problems (with animated solutions) from Brian Dean's website. Some of these are A level problems.