
TL;DR: How do you interpret matrix/vector multiplication if the matrix is a graph adjacency matrix?

There exists a bijection between $\mathbb{R}^{m \times m}$ and directed weighted graphs. Specifically, you can define $M \in \mathbb{R}^{m \times m}$ as the adjacency matrix of a weighted graph with $m$ nodes, where $M_{ij}$ is the weight of the edge from nodes $i \rightarrow j$.

One famous application of this is in Markov processes. We define a Markov process to be any weighted graph with the property that the outgoing weights from every node sum to 1, so every row of the adjacency matrix sums to 1.

Let $P_m \subset \mathbb{R}^{m \times m}$ be the set of $m$-node Markov processes.

If you have some Markov process $\bar{M} \in P_m$, and you also have some "weight" vector $x \in \mathbb{R}^{m}$ with nonnegative entries summing to 1, then $$\bar{M}^{\top}x$$

(the transpose appears because it is the *rows* of $\bar{M}$ that sum to 1; equivalently one writes $x^{\top}\bar{M}$) will output a "new" weight vector, whose entries again sum to 1, which represents what happens to the weight vector $x$ after one timestep of the Markov process.
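As a minimal sketch of this timestep update (a toy 3-state chain I'm making up for illustration, not from the question): rows of `M` sum to 1, and the weight vector is pushed forward with `x @ M`, i.e. $\bar{M}^{\top}x$.

```python
import numpy as np

# Toy Markov process: row i holds the transition probabilities out of state i,
# so every row sums to 1.
M = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

x = np.array([1.0, 0.0, 0.0])   # all weight starts on state 0
x_next = x @ M                  # one timestep: x^T M, i.e. M^T x

print(x_next)        # [0.9 0.1 0. ]
print(x_next.sum())  # 1.0 -- total weight is conserved
```

Note that total weight is conserved precisely because each row of `M` sums to 1.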

If you have

$$\bar{M}^{\top}x^{*} = x^{*}$$ then $x^{*}$ (an eigenvector of $\bar{M}^{\top}$ with eigenvalue 1, i.e. a left eigenvector of $\bar{M}$) is the steady-state distribution of the Markov process. This fact underlies the famous PageRank algorithm that catalyzed Google: modelling the web as a Markov process, $x^*$ is the PageRank vector of the web, and can be used to rank web pages by importance.
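One way to see the steady state concretely is power iteration, which is also the classic way the PageRank vector is computed. A sketch (same toy chain as above, chosen to be irreducible and aperiodic so iteration converges; this is an illustration, not the actual web matrix):

```python
import numpy as np

# Toy row-stochastic Markov process (rows sum to 1).
M = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# Power iteration: repeatedly push the weight vector through one timestep.
x = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    x = x @ M

# x is now (approximately) a fixed point of the update: x @ M == x,
# i.e. a left eigenvector of M with eigenvalue 1.
print(np.allclose(x @ M, x))  # True
```

For a "nice" (irreducible, aperiodic) chain the iteration converges to the same $x^*$ regardless of the starting distribution.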

Thus for Markov processes, I have an intuitive understanding of what matrix multiplication "means". But I can't help wondering whether there is a similar intuitive understanding if you relax the Markov-process/unit-vector constraints.


My question specifically: what does matrix multiplication represent in the more general case of $M \in \mathbb{R}^{m \times m}$ and $x \in \mathbb{R}^m$, i.e. when the outgoing weights do not sum to 1? Is there an "intuitive" explanation of what this physically means? E.g. "number of paths between two nodes" is the kind of answer I'm after, rather than a double-summation formula with no motivation.

  • If $v=(v_i)$ is the vector and the result is $w=(w_i)$, then $w_i$ is the sum of the $v_j$ with node $j$ adjacent to node $i$. So if we think of $v_i$ as the value of node $i$, then $w_i$ is the sum of the values of the nodes adjacent to $i$. There isn't much more to it than that. (If the graph is directed, then "adjacent" needs to be given a direction: there must be an edge from node $i$ to node $j$ for the value to be counted.) Commented Aug 22 at 19:13
  • Nitpicking: with the Markov process, only the rows sum to 1; the columns do not necessarily. Commented Aug 22 at 20:43

1 Answer


Matrix multiplication with weights in $\{0,1\}$ is just a way of writing down the different combinatorial options, or paths, on a graph: the $(i,j)$ entry of $A^k$ counts the walks of length $k$ from node $i$ to node $j$.
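A small sketch of the walk-counting interpretation on a made-up 3-node directed graph (illustrative, not from the answer):

```python
import numpy as np

# 0/1 adjacency matrix: A[i, j] = 1 iff there is an edge i -> j.
A = np.array([[0, 1, 1],   # edges 0->1, 0->2
              [0, 0, 1],   # edge  1->2
              [1, 0, 0]])  # edge  2->0

# (A^2)[i, j] counts walks of length 2 from i to j.
A2 = A @ A
print(A2[0, 2])  # 1: the only length-2 walk from 0 to 2 is 0->1->2
```

Multiplying $A$ by a 0/1 indicator vector reads off the same combinatorics one hop at a time.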

Now, if you allow weights in $\mathbb{R}$, then you allow 'dampening' or 'amplifying' a 'signal' (including 'sign changes') as it passes along an edge.
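The dampening/amplifying picture can be sketched with the same shape of matrix, but with real-valued gains on the edges (a hypothetical example, pushing the signal forward with $v^{\top}W$ so it travels along edge directions):

```python
import numpy as np

# W[i, j] is the gain applied to the signal passing along edge i -> j.
W = np.array([[0.0, 0.5, 2.0],    # 0->1 halves the signal, 0->2 doubles it
              [0.0, 0.0, -1.0],   # 1->2 flips the sign
              [1.0, 0.0, 0.0]])   # 2->0 passes it through unchanged

v = np.array([1.0, 1.0, 0.0])     # unit signal sitting on nodes 0 and 1
print(v @ W)  # [0.  0.5 1. ]: node 2 receives 1*2.0 + 1*(-1.0) = 1.0
```

With all gains in $\{0,1\}$ this reduces to the path-counting case; general real weights scale and cancel the counted paths instead.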

