
First of all, am I being crazy in thinking that if $\lambda$ is an eigenvalue of $AB$, where $A$ and $B$ are both $N \times N$ matrices (not necessarily invertible), then $\lambda$ is also an eigenvalue of $BA$?

If it's not true, then under what conditions is it true or not true?

If it is true, can anyone point me to a citation? I couldn't find it in a quick perusal of Horn & Johnson. I have seen a couple of proofs that the characteristic polynomial of $AB$ is equal to the characteristic polynomial of $BA$, but none with any citations.

A trivial proof would be OK, but a citation is better.

  • Have you tried to find a counterexample using basic $2\times2$ matrices? (Commented Mar 27, 2012 at 1:16)
  • Here are 7 proofs: ias.ac.in/article/fulltext/reso/007/01/0088-0093 (Commented Oct 19, 2022 at 9:39)

4 Answers


If $v$ is an eigenvector of $AB$ for some nonzero eigenvalue $\lambda$, then $Bv\ne0$ (since $ABv=\lambda v\ne0$) and $$\lambda Bv=B(ABv)=(BA)Bv,$$ so $Bv$ is an eigenvector of $BA$ with the same eigenvalue. If $0$ is an eigenvalue of $AB$, then $0=\det(AB)=\det(A)\det(B)=\det(BA)$, so $0$ is also an eigenvalue of $BA$.

More generally, Jacobson's lemma in operator theory states that for any two bounded operators $A$ and $B$ acting on a Hilbert space $H$ (or more generally, for any two elements of a Banach algebra), the non-zero points of the spectrum of $AB$ coincide with those of the spectrum of $BA$.
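Both parts of the finite-dimensional argument are easy to check numerically. Here is a minimal NumPy sketch (the dimension and random seed are arbitrary): it compares the characteristic polynomials of $AB$ and $BA$ with $A$ deliberately made singular, and verifies that $Bv$ is an eigenvector of $BA$ for a nonzero eigenvalue of $AB$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4

A = rng.standard_normal((N, N))
A[:, 0] = A[:, 1]                  # duplicate a column: A (hence AB) is singular
B = rng.standard_normal((N, N))

# Same characteristic polynomial => same eigenvalues, with multiplicity.
assert np.allclose(np.poly(A @ B), np.poly(B @ A))

# For a nonzero eigenvalue lam of AB with eigenvector v,
# Bv is an eigenvector of BA with the same eigenvalue.
w, V = np.linalg.eig(A @ B)
i = int(np.argmax(np.abs(w)))      # pick the eigenvalue farthest from zero
lam, v = w[i], V[:, i]
assert np.allclose((B @ A) @ (B @ v), lam * (B @ v))
```

Comparing characteristic-polynomial coefficients (rather than sorted eigenvalue lists) avoids numerical ordering issues with complex conjugate pairs.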

  • Bob, I am a little confused by your proof. How did you know to use the trick $Bv \ne 0$ to prove this? (Commented Nov 24, 2012 at 1:54)
  • @diimension You just take an eigenvector of $AB$ and do the only calculation you can. It turns out that $Bv$ satisfies the eigenvector equation for $BA$, so you hope that it is not $0$ and check this at the end. There's no need to know it at the start of the calculation. (Commented Dec 21, 2012 at 17:27)
  • @Phira thank you very much! (Commented Dec 26, 2012 at 8:21)
  • I know this is an ancient thread, but hopefully you're still lurking out there somewhere. Why do you need to hope that $\lambda\ne 0$? Doesn't the argument still work just fine in the case where $\lambda=0$? (Commented Apr 22, 2013 at 12:32)
  • @crf No: the trouble is that when $Bv=0$, maybe $v$ is not in the range of $A$. The argument given for $\lambda\ne0$ works in Hilbert space, for example, but for $\lambda=0$ the result is not generally true there: on the infinite sequence space $\ell^2$, let $A$ be the right shift ($A(x_1,x_2,\ldots)=(0,x_1,x_2,\ldots)$) and $B$ the left shift. Then $AB(1,0,\ldots)=0$, but $BA$ is the identity. (Commented May 7, 2013 at 15:31)

It is true that the eigenvalues (counting multiplicity) of $AB$ are the same as those of $BA$.

This is a corollary of Theorem 1.3.22 in the second edition of "Matrix Analysis" by Horn and Johnson, which is Theorem 1.3.20 in the first edition.

Paraphrasing from the cited Theorem: If $A$ is an $m \times n$ matrix and $B$ is an $n \times m$ matrix with $n \geq m$, then the characteristic polynomial $p_{BA}$ of $BA$ is related to the characteristic polynomial $p_{AB}$ of $AB$ by $$p_{BA}(t) = t^{n-m} p_{AB}(t).$$

In your case, $n = m$, so $p_{BA} = p_{AB}$ and it follows that the eigenvalues (counting multiplicity) of $AB$ and $BA$ are the same.
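A quick NumPy sketch of the rectangular case (the dimensions and seed are arbitrary): `np.poly` returns the characteristic-polynomial coefficients with the highest degree first, so multiplying by $t^{n-m}$ just appends $n-m$ zero coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 2, 5                         # n >= m, as in the theorem

A = rng.standard_normal((m, n))     # A is m x n
B = rng.standard_normal((n, m))     # B is n x m

p_AB = np.poly(A @ B)               # degree-m characteristic polynomial of AB
p_BA = np.poly(B @ A)               # degree-n characteristic polynomial of BA

# p_BA(t) = t^(n-m) * p_AB(t): same leading coefficients,
# followed by n-m (numerically tiny) zeros.
assert np.allclose(p_BA, np.concatenate([p_AB, np.zeros(n - m)]))
```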

You can see Horn and Johnson's proof in the book itself; a similar proof was given in an answer by Maisam Hedyelloo.

  • Nice answer, thanks! Is there a way to relate the eigenvectors of $AB$ and $BA$ when $A$ and $B$ have different, compatible dimensions? I see that if $v\ne 0$ is an eigenvector of $AB$, then $BA(Bv)= \lambda(Bv)$, provided $Bv \ne 0$, so there's a catch there, it seems. What I'm looking for is a theorem that completely relates the eigenvectors of $AB$ and $BA$. I'm most interested in the case $A=X$, $B=X^{T}$. (Commented Apr 6, 2020 at 18:22)
  • Yes, this has the added benefit of proving the equal algebraic multiplicities. (Commented Apr 27, 2023 at 13:39)

Here is an alternative proof for this result, following Exercises 6.2.8-9 of Hoffman & Kunze's Linear Algebra (p. 190):


Lemma: Let $A,B\in M_n(\mathbb{F})$, where $\mathbb{F}$ is an arbitrary field. If $I-AB$ is invertible, then so is $I-BA$, and

$$(I-BA)^{-1}=I+B(I-AB)^{-1}A.$$

Proof of Lemma: Since $I-AB$ is invertible,

\begin{align} &I=(I-AB)(I-AB)^{-1}=(I-AB)^{-1}-AB(I-AB)^{-1}\\ &\implies (I-AB)^{-1} = I+ AB(I-AB)^{-1}. \end{align}

Then we have

\begin{align} I+B(I-AB)^{-1}A&= I+B[I+ AB(I-AB)^{-1}]A= I+BA+BAB(I-AB)^{-1}A\\ \implies I&=I+B(I-AB)^{-1}A-BA-BAB(I-AB)^{-1}A\\ &=I[I+B(I-AB)^{-1}A]-BA[I+B(I-AB)^{-1}A]\\ &=(I-BA)[I+B(I-AB)^{-1}A]. \quad\checkmark \end{align}
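The lemma's inverse formula can be sanity-checked numerically. A minimal NumPy sketch (the matrices are scaled down so that $I-AB$ is safely invertible; the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
# Small entries keep the spectral radius of AB below 1,
# so I - AB and I - BA are both invertible.
A = 0.2 * rng.standard_normal((n, n))
B = 0.2 * rng.standard_normal((n, n))
I = np.eye(n)

# (I - BA)^{-1} = I + B (I - AB)^{-1} A
lhs = np.linalg.inv(I - B @ A)
rhs = I + B @ np.linalg.inv(I - A @ B) @ A
assert np.allclose(lhs, rhs)
```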


Proposition: $\forall A,B\in M_n(\mathbb{F}):$ $AB$ and $BA$ have the same eigenvalues.

Proof: Let $\alpha\in\mathbb{F}$ be an eigenvalue of $AB$. If $\alpha=0$, then $0=\det(0I-AB)=\det(-A)\det(B)=\det(B)\det(-A)=\det(0I-BA)$ and so $0$ is an eigenvalue of $BA$ also.

Otherwise $\alpha\neq0$. Suppose $\alpha$ is not an eigenvalue of $BA$. Then $0\neq\det(\alpha I-BA)=\alpha^n\det(I-(\frac{1}{\alpha}B)A)$. Then $0\neq\det(I-(\frac{1}{\alpha}B)A),$ so that $I-(\frac{1}{\alpha}B)A$ is invertible. By the lemma above we know that $I-A(\frac{1}{\alpha}B)$ is invertible as well, meaning $0\neq\det(I-A(\frac{1}{\alpha}B))=\det(I-\frac{1}{\alpha}AB) \implies 0\neq\det(\alpha I-AB)$. But we assumed $\alpha$ to be an eigenvalue for $AB$, $\unicode{x21af}$.

  • Do we need the condition of invertibility of $A$ or $B$ here? (Commented Dec 29, 2016 at 10:46)
  • @StubbornAtom No, the arguments I presented make no use of that condition ($M_n(\Bbb{F})$ is the set of all $n\times n$ matrices with entries from $\Bbb{F}$). (Commented Dec 30, 2016 at 17:11)
  • Amazing answer. Thank you. (Commented Jun 4, 2017 at 7:38)
  • How do you get $\det(\alpha I- BA) = \alpha^n\cdots$? (Commented Oct 11, 2017 at 6:51)
  • The equation doesn't end there, and the determinant is multilinear in the columns. (Commented Oct 12, 2017 at 0:27)

Notice that if $\lambda$ is an eigenvalue of $AB$, then $\det(AB-\lambda I)=0$, and hence $$0=\det(A^{-1})\det(AB-\lambda I)\det(B^{-1})=\det\big(A^{-1}(AB-\lambda I)B^{-1}\big)=\det\big((B-\lambda A^{-1})B^{-1}\big)=\det(I-\lambda A^{-1}B^{-1}).$$ This further implies that $$\det(BA-\lambda I)=\det\big(BA(I-\lambda A^{-1}B^{-1})\big)=\det(BA)\det(I-\lambda A^{-1}B^{-1})=0,$$ i.e., $\lambda$ is an eigenvalue of $BA$. This proof holds only for invertible matrices $A$ and $B$, though. For singular matrices you can show that $0$ is a common eigenvalue, but I can't think of a way to show that the rest of the eigenvalues are equal.
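For concreteness, a NumPy sketch of this argument (seed arbitrary; $A$ and $B$ are shifted by a multiple of $I$ to keep them safely invertible): both determinants in the chain vanish at any eigenvalue $\lambda$ of $AB$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
# Shifting by 3I keeps A and B (hence A^{-1}, B^{-1}) safely invertible.
A = rng.standard_normal((n, n)) + 3 * np.eye(n)
B = rng.standard_normal((n, n)) + 3 * np.eye(n)
I = np.eye(n)

lam = np.linalg.eigvals(A @ B)[0]          # any eigenvalue of AB
Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)

# Both determinants from the chain above vanish at lam.
d1 = np.linalg.det(I - lam * Ainv @ Binv)
d2 = np.linalg.det(B @ A - lam * I)
assert abs(d1) < 1e-6 and abs(d2) < 1e-6
```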
