
Number of Walks of Length n

Using the graph's adjacency matrix and standard matrix operations, we can give an algorithm for counting walks of length n. To follow this section you need to be familiar with matrix multiplication and have some intuition for it; if you are not, a quick search for matrix multiplication will get you up to speed.

Meaning of Matrix Multiplication in Graphs

Consider two graphs on the same n vertices: a blue graph with adjacency matrix A and a red graph with adjacency matrix B. We want to count the walks of length two from vertex i to vertex j whose first edge is blue and whose second edge is red; denote this count by \(C_{i,j}\). Clearly, we can compute this value by enumerating the intermediate vertex of the walk (a vertex k), multiplying the number of blue edges from i to k by the number of red edges from k to j, and summing the results over all possible k. Mathematically:

\[C_{i,j} = \sum\limits_{k=1}^{n} A_{i,k}B_{k,j}\]

A little inspection shows that the matrix C is exactly the product of the matrices A and B.
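
As a rough sketch, a direct \(O(n^3)\) implementation of this product could look like the following (the Matrix alias and the multiply function are illustrative names, not taken from any particular library):

```cpp
#include <vector>
using namespace std;

using Matrix = vector<vector<long long>>;

// C[i][j] counts walks i -> k -> j whose first edge is blue (taken from A)
// and whose second edge is red (taken from B), summed over all k.
Matrix multiply(const Matrix& A, const Matrix& B) {
    int n = A.size();
    Matrix C(n, vector<long long>(n, 0));
    for (int i = 0; i < n; i++)
        for (int k = 0; k < n; k++)
            for (int j = 0; j < n; j++)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}
```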

Matrix Power

We define matrix powers just like powers of numbers: \(A^n\) means A multiplied by itself n times. From the observation above, it follows that the number of walks of length n from i to j equals the (i,j)-th entry of \(A^n\). For the proof, think of every walk of length n-1 as a single 'blue' edge (so the blue multigraph has adjacency matrix \(A^{n-1}\)) and of every edge of the graph as a 'red' edge. Clearly, every walk of length n decomposes uniquely into one blue edge followed by one red edge, and since \(A^n = A^{n-1}A\), the claim follows by induction on n.
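
For a concrete check, take the path graph on three vertices 1, 2, 3 (edges 1-2 and 2-3). Its adjacency matrix and its square are:

\[A = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}, \qquad A^2 = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 0 \\ 1 & 0 & 1 \end{pmatrix}\]

Indeed, \((A^2)_{2,2} = 2\) counts the two walks 2,1,2 and 2,3,2 of length two, while \((A^2)_{1,3} = 1\) counts the single walk 1,2,3.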

Algorithm and Complexity

Therefore, to solve the problem, it suffices to be able to compute matrix powers efficiently. Since matrix multiplication is associative (i.e., \((AB)C = A(BC)\)), we may group the factors of a power however we like. If the exponent is even, we have:

\[A^{2k} = (A^k)^2\]

And if it's odd, we have:

\[A^{2k+1} = A(A^k)(A^k)\]

If we compute matrix multiplication using its naive algorithm, which is \(O(n^3)\), we can recursively compute \(A^k\) by first calculating \(A^{\lfloor\frac{k}{2}\rfloor}\) and then using the above relations to find the answer. The runtime of this recursive algorithm is:

\[T(k) = T\!\left(\frac{k}{2}\right) + O(n^3) = O(n^3 \lg k)\]

This algorithm is called fast exponentiation (or binary exponentiation), and it can be used whenever we want to apply an associative operation to the same element many times in a row. For example, the very same algorithm computes powers of ordinary numbers.
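
A minimal sketch of the iterative form of this algorithm, reusing the illustrative multiply helper from above, might look like this (in real counting problems the entries grow very quickly, so one usually reduces them modulo some number):

```cpp
// Returns the n x n identity matrix, which plays the role of A^0.
Matrix identity(int n) {
    Matrix I(n, vector<long long>(n, 0));
    for (int i = 0; i < n; i++) I[i][i] = 1;
    return I;
}

// Computes A^k using O(lg k) matrix multiplications.
Matrix power(Matrix A, long long k) {
    Matrix result = identity(A.size());
    while (k > 0) {
        if (k & 1) result = multiply(result, A);  // odd exponent: pick up one factor of A
        A = multiply(A, A);                       // square A, halving the exponent
        k >>= 1;
    }
    return result;
}
```

The (i,j)-th entry of power(A, n) is then the number of walks of length n from i to j.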

Generalization

The algorithm discussed here is not limited to counting walks of a fixed length; it can also be used to compute other properties of fixed-length walks (such as the longest walk or the walk with the minimum-weight edge). For example, suppose we want to find a walk of length k between two vertices whose total edge weight is as large as possible. If we first solve this problem for a graph with blue and red edges, looking only at walks made of one blue edge followed by one red edge (with weight matrices A and B respectively), the answer is obtained as follows:

\[C_{i,j} = \max\limits_{k=1}^{n} \left( A_{i,k} + B_{k,j} \right)\]

We can call the matrix C the 'composition' of the matrices A and B. A moment's thought shows that this composition operation is associative, just like matrix multiplication. For a proof, consider a graph with blue, red, and green edges, and compute the maximum-weight blue-red-green walk in two different ways, grouping either the first two colors or the last two. Thus, just as before, matrix powers can be defined, it can be shown by induction that \((A^k)_{i,j}\) equals the maximum weight of a walk of length k between vertices i and j, and an \(O(n^3 \lg k)\) algorithm exists for computing this matrix.
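
As a sketch, this composition could be implemented as follows, with NEG_INF marking missing edges (the Matrix alias is the same illustrative type as before, now holding edge weights instead of edge counts). Substituting compose for multiply in the exponentiation routine above then gives the \(O(n^3 \lg k)\) algorithm for maximum-weight walks of length k:

```cpp
#include <algorithm>
#include <limits>

// A large negative value standing in for "no edge / no walk".
const long long NEG_INF = numeric_limits<long long>::min() / 2;

// C[i][j] is the maximum total weight of a walk i -> k -> j whose first edge
// is taken from A and whose second edge is taken from B.
Matrix compose(const Matrix& A, const Matrix& B) {
    int n = A.size();
    Matrix C(n, vector<long long>(n, NEG_INF));
    for (int i = 0; i < n; i++)
        for (int k = 0; k < n; k++) {
            if (A[i][k] == NEG_INF) continue;      // no usable first edge
            for (int j = 0; j < n; j++) {
                if (B[k][j] == NEG_INF) continue;  // no usable second edge
                C[i][j] = max(C[i][j], A[i][k] + B[k][j]);
            }
        }
    return C;
}
```

One small detail: the neutral element of this composition is not the ordinary identity matrix but the matrix with 0 on the diagonal and NEG_INF everywhere else, so the identity helper from the earlier sketch has to be adjusted accordingly.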