Shortest paths and matrix multiplication
A recursive solution to the all-pairs shortest-paths problem
Now, let $l_{ij}^{(m)}$ be the minimum weight of any path from vertex i to vertex j that contains at most m edges. When $m=0$, there is a shortest path from i to j with no edges if and only if $i=j$. Thus,

$l_{ij}^{(0)} = \begin{cases} 0 & \text{if } i = j \\ \infty & \text{if } i \neq j \end{cases}$

The actual shortest-path weights are given by
$\delta(i,j) = l_{ij}^{(n-1)} = l_{ij}^{(n)} = l_{ij}^{(n+1)} = \cdots,$
because, assuming the graph contains no negative-weight cycles, there is a shortest path from i to j that is simple and thus contains at most $n-1$ edges.
Computing the shortest-path weights bottom up

Each step computes $l_{ij}^{(m)} = \min_{1 \le k \le n}\left\{ l_{ik}^{(m-1)} + w_{kj} \right\}$ (starting from $L^{(1)} = W$), which extends the shortest paths computed so far by one more edge; after $n-2$ such extensions we have $L^{(n-1)} = \left(\delta(i,j)\right)$.

Running time: $\Theta(n^4)$
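
As an illustration, here is a minimal Python sketch of the extension step and of the resulting $\Theta(n^4)$ procedure. It assumes the input is an $n \times n$ weight matrix with 0 on the diagonal and `float('inf')` for absent edges; the function names are illustrative, not fixed by the text.

```python
INF = float("inf")

def extend_shortest_paths(L, W):
    """One (min, +) step: L[i][j] holds l_ij^(m-1), W[i][j] holds w_ij.

    Returns the matrix of l_ij^(m) values; a single call costs Theta(n^3).
    """
    n = len(L)
    result = [[INF] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            best = INF
            for k in range(n):
                best = min(best, L[i][k] + W[k][j])
            result[i][j] = best
    return result

def slow_all_pairs_shortest_paths(W):
    """Compute L^(n-1) by n-2 successive one-edge extensions: Theta(n^4) overall."""
    n = len(W)
    L = W                        # L^(1) = W
    for _ in range(2, n):        # m = 2, 3, ..., n-1
        L = extend_shortest_paths(L, W)
    return L
```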
Improving the running time

Since the goal is only $L^{(n-1)}$ and $l_{ij}^{(m)} = \delta(i,j)$ for all $m \ge n-1$, we can square instead of extending one edge at a time: compute $L^{(1)}, L^{(2)}, L^{(4)}, \ldots$, stopping once the superscript reaches at least $n-1$, for a total of only $\lceil \lg(n-1) \rceil$ extension steps.

Running time: $\Theta(n^3 \lg n)$
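
A sketch of the repeated-squaring version, reusing `extend_shortest_paths` from the previous sketch; overshooting $n-1$ is harmless because the matrices no longer change once $m \ge n-1$.

```python
def faster_all_pairs_shortest_paths(W):
    """Repeated squaring of the (min, +) product: Theta(n^3 lg n) time."""
    n = len(W)
    L = W                                  # L^(1) = W
    m = 1
    while m < n - 1:
        L = extend_shortest_paths(L, L)    # compute L^(2m) from L^(m)
        m *= 2
    return L
```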
The Floyd-Warshall algorithm

The Floyd-Warshall algorithm instead structures the problem by the intermediate vertices of a shortest path: $d_{ij}^{(k)}$ is the weight of a shortest path from i to j whose intermediate vertices are all in $\{1, 2, \ldots, k\}$, so $d_{ij}^{(0)} = w_{ij}$ and $d_{ij}^{(k)} = \min\left(d_{ij}^{(k-1)},\ d_{ik}^{(k-1)} + d_{kj}^{(k-1)}\right)$ for $k \ge 1$, with $\delta(i,j) = d_{ij}^{(n)}$.

Running time: $\Theta(n^3)$
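
A direct Python rendering of this recurrence, under the same weight-matrix convention as the earlier sketches (0 on the diagonal, `float('inf')` for absent edges):

```python
def floyd_warshall(W):
    """Round k allows vertex k as an intermediate vertex on any path."""
    n = len(W)
    D = [row[:] for row in W]      # D^(0) = W
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # Keep the better of "avoid k" and "go through k".
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D
```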
Transitive closure of a directed graph
We define the transitive closure of G as the graph $G^* = (V, E^*)$, where $E^* = \{(i,j):$ there is a path from vertex i to vertex j in $G\}$.
One way to compute the transitive closure of a graph in $\Theta(n^3)$ time is to assign a weight of 1 to each edge of E and run the Floyd-Warshall algorithm. If there is a path from vertex i to vertex j, we get $d_{ij} < n$. Otherwise, we get $d_{ij} = \infty$.
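
A small sketch of that method, reusing `floyd_warshall` and `INF` from the sketches above; the input is assumed here to be a Boolean adjacency matrix.

```python
def transitive_closure(adj):
    """adj[i][j] is True iff (i, j) is an edge; returns the reachability matrix."""
    n = len(adj)
    # Unit-weight matrix: 0 on the diagonal, 1 for edges, infinity otherwise.
    W = [[0 if i == j else (1 if adj[i][j] else INF) for j in range(n)]
         for i in range(n)]
    D = floyd_warshall(W)
    # Any existing path uses at most n - 1 unit-weight edges, so d_ij < n.
    return [[D[i][j] < n for j in range(n)] for i in range(n)]
```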
Johnson’s algorithm for sparse graphs
Johnson’s algorithm finds shortest paths between all pairs in $O(V^2 \lg V + VE)$ time, which beats $\Theta(n^3)$ on sparse graphs. It first runs Bellman-Ford from an added source vertex to compute a function h with which it reweights every edge to a nonnegative value, and then runs Dijkstra’s algorithm (with a Fibonacci heap) once from each vertex on the reweighted graph.
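
A compact sketch of that scheme under assumptions made here (vertices numbered 0 to n-1, edges given as (u, v, w) triples); it uses Python’s binary-heap `heapq` for the Dijkstra phase, so this version runs in $O(VE \lg V)$ rather than the Fibonacci-heap bound quoted above.

```python
import heapq

INF = float("inf")

def johnson(n, edges):
    """Bellman-Ford reweighting followed by Dijkstra from every vertex.

    Returns the n x n matrix of shortest-path weights, or None if the graph
    contains a negative-weight cycle.
    """
    # Bellman-Ford from an added source s = n with a 0-weight edge to every vertex.
    h = [0] * (n + 1)
    augmented = edges + [(n, v, 0) for v in range(n)]
    for _ in range(n):                       # the augmented graph has n + 1 vertices
        for u, v, w in augmented:
            if h[u] + w < h[v]:
                h[v] = h[u] + w
    if any(h[u] + w < h[v] for u, v, w in augmented):
        return None                          # a negative-weight cycle exists

    # Reweight: w'(u, v) = w(u, v) + h(u) - h(v) >= 0 for every edge.
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((v, w + h[u] - h[v]))

    # Dijkstra from each source on the reweighted graph, then undo the reweighting.
    D = [[INF] * n for _ in range(n)]
    for s in range(n):
        dist = [INF] * n
        dist[s] = 0
        heap = [(0, s)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist[u]:
                continue                     # stale entry
            for v, w in adj[u]:
                if d + w < dist[v]:
                    dist[v] = d + w
                    heapq.heappush(heap, (dist[v], v))
        for t in range(n):
            if dist[t] < INF:
                D[s][t] = dist[t] + h[t] - h[s]
    return D
```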
