
Let $A \in \mathbb{C}^{n \times n}$ be an upper triangular matrix that satisfies $A^{*}A=AA^{*}$. Prove that $A$ must be diagonal.

My attempt is to partition $A$ as follows:

$$ A = \left[\begin{array}{cc} a_{11} & \alpha\\ 0 & \hat{A}\end{array}\right] $$

where $\alpha = (a_{12}, a_{13}, \cdots , a_{1n})$ and $\hat{A}$ is $A$ with the first row and column removed. Using this partitioning, we have:

$$ A^{*}A = \left[\begin{array}{cc} |a_{11}|^{2} & \bar{a}_{11}\alpha\\ a_{11}\alpha^{*} & \alpha^{*}\alpha + \hat{A}^{*}\hat{A}\end{array}\right] \qquad AA^{*} = \left[\begin{array}{cc} |a_{11}|^{2} + \alpha\alpha^{*} & \alpha\hat{A}^{*}\\ \hat{A}\alpha^{*} & \hat{A}\hat{A}^{*}\end{array}\right] $$

Examining entry $(1,1)$ of each product, and using $A^{*}A=AA^{*}$, we see that $\alpha\alpha^{*} = 0$. From this, I would like to conclude that $\alpha = 0$, so that the first row of $A$ has a nonzero entry only at $(1,1)$. Then repeat this argument inductively on $\hat{A}$.

However, I can see a flaw in my argument. If the entries of $A$ were real, the argument would work. But since the entries can be complex, it seems that $\alpha\alpha^{*} = 0$ is possible even with $\alpha \ne 0$: for example, $\alpha = (1, i, 1, i)$ appears to give $\alpha\alpha^{*} = 0$.

Any ideas how to proceed here? I know that with this partitioning, I must have $\alpha = 0$ since the problem statement is true (i.e. $A$ is diagonal).

  • $A^\ast$ denotes the adjoint (a.k.a. the conjugate transpose), not the straight-up transpose. Hence, in particular, since $\alpha$ is a complex row vector, $\alpha \alpha^\ast = \|\alpha\|^2$, where $\|v\| = \sqrt{|v_1|^2+\cdots+|v_n|^2}$ denotes the usual norm on $\mathbb{C}^n$. (Commented Jan 27, 2014 at 4:45)
  • Note that "$*$" usually means not just transpose but conjugate transpose. So your counterexample isn't actually a counterexample: $\alpha\alpha^*$ is the squared norm of $\alpha$, and one of the properties of a norm is that $\|\alpha\|=0$ implies $\alpha = 0$. So you actually have the right idea in your proof. (Commented Jan 27, 2014 at 4:45)
  • As the previous comments said, your proof is absolutely correct. Squirtle's prompt answer below was identical. (Commented Sep 19, 2022 at 9:13)
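The transpose-versus-adjoint distinction the comments point out can be checked directly. This is a quick numerical sketch (assuming NumPy is available) of the question's proposed vector $\alpha = (1, i, 1, i)$: the non-conjugated product cancels to $0$, while the true $\alpha\alpha^{*}$ is the squared norm $4$.

```python
import numpy as np

# The proposed "counterexample" from the question: alpha = (1, i, 1, i).
alpha = np.array([1, 1j, 1, 1j])

# With a plain (non-conjugated) transpose, the terms can cancel:
plain = np.sum(alpha * alpha)          # 1 + i^2 + 1 + i^2 = 0

# With the conjugate transpose -- what * actually denotes -- we get the
# squared norm, which vanishes only when alpha itself is zero:
conj = np.sum(alpha * np.conj(alpha))  # |1|^2 + |i|^2 + |1|^2 + |i|^2 = 4

print(plain, conj)
```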

3 Answers


First look at the $(1,1)$ entry of each product. Since $A$ is upper triangular, the first column of $A$ has $A_{1,1}$ as its only possibly nonzero entry, so

$$(A^{*}A)_{1,1}=\sum_{j} \bar{A}_{j,1} A_{j,1} = |A_{1,1}|^2,$$

while

$$(AA^{*})_{1,1}=\sum_{j} A_{1,j} \bar{A}_{1,j} = |A_{1,1}|^2 + |A_{1,2}|^2 + \cdots + |A_{1,n}|^2.$$

Since $AA^{*}=A^{*}A$, these two quantities are equal, which forces

$$A_{1,j} = 0 \quad \forall\, j\neq 1.$$

Repeat for every row $i$: once rows $1, \dots, i-1$ are known to be zero off the diagonal, column $i$ of $A$ has $A_{i,i}$ as its only nonzero entry, and the same comparison of the $(i,i)$ entries kills the rest of row $i$.

QED
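The row comparison above can be verified numerically. This sketch (assuming NumPy; the random upper triangular matrix is hypothetical test data) confirms that the difference of the $(1,1)$ entries of $AA^{*}$ and $A^{*}A$ is exactly the off-diagonal mass of row 1, which normality would force to zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random complex upper triangular matrix (hypothetical test data).
n = 5
A = np.triu(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

lhs = (A @ A.conj().T)[0, 0]          # (A A^*)_{1,1} = sum_j |A_{1,j}|^2
rhs = (A.conj().T @ A)[0, 0]          # (A^* A)_{1,1} = |A_{1,1}|^2 by triangularity
tail = np.sum(np.abs(A[0, 1:]) ** 2)  # off-diagonal mass of row 1

# Normality would force lhs == rhs, i.e. tail == 0.
print(abs(lhs - rhs - tail))
```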

  • The final equality is true a priori because we are assuming that $AA^*=A^*A$; this forces the intermediate equality as well (i.e. $c=e$ and $e=f$ imply $c=f$). (Commented Jan 27, 2014 at 5:47)


Partition $A$ as $$ A = \begin{pmatrix} A_{11} & A_{12}\\0 & A_{22}\end{pmatrix}. $$ If $A$ is normal, then $$ AA^* = \begin{pmatrix} A_{11}A_{11}^* + A_{12}A_{12}^* & \star\\\star & \star \end{pmatrix} = \begin{pmatrix} A_{11}^*A_{11} & \star\\\star & \star \end{pmatrix} = A^*A, $$ so $A_{11}^*A_{11} = A_{11}A_{11}^* + A_{12}A_{12}^*$. Now this means $$ \operatorname{tr}(A_{11}^*A_{11})= \operatorname{tr}(A_{11}A_{11}^*)= \operatorname{tr}(A_{11}A_{11}^* + A_{12}A_{12}^*) = \operatorname{tr}(A_{11}A_{11}^*) + \operatorname{tr}(A_{12}A_{12}^*) \implies \operatorname{tr}(A_{12}A_{12}^*) = 0 \implies A_{12}=0. $$ So now we have established that $A$ is block diagonal. Further, $A_{11}$ and $A_{22}$ are normal. Proceed by induction on $A_{11},A_{22}$, and further blocks to conclude that $A$ is in fact diagonal.
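The key step above, $\operatorname{tr}(A_{12}A_{12}^*) = 0 \implies A_{12}=0$, rests on the fact that $\operatorname{tr}(BB^*)$ equals the sum of the squared moduli of the entries of $B$. A quick numerical check of that fact, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fact used above: tr(B B^*) equals the sum of |b_{ij}|^2 over all entries,
# so it vanishes only when every entry of B is zero.
B = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))

trace_val = np.trace(B @ B.conj().T).real  # tr(B B^*)
frob_sq = np.sum(np.abs(B) ** 2)           # squared Frobenius norm

print(abs(trace_val - frob_sq))
```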

  • Is reaching out for the trace really needed? Your third equation line yields $A_{12}\left(A_{12}\right)^* = 0$, which is satisfied (if and) only if each entry of $A_{12}$ is zero. (Commented Aug 15, 2022 at 9:52)
  • @Hanno You've just made me realize that I made a mistake. (Commented Aug 15, 2022 at 10:00)


The trace of $AA^*$ is $\sum_{i,j}|a_{i,j}|^2$.

On the other hand, since $A$ commutes with $A^*$ it is normal, hence unitarily diagonalizable, and the eigenvalues of $AA^*$ are the numbers $|\lambda|^2$ with $\lambda$ an eigenvalue of $A$. Now $A$ is triangular, so the eigenvalues of $A$ are its diagonal entries $a_{1,1}, \dots, a_{n,n}$, and we see that $\operatorname{tr}(AA^*)=\sum_i|a_{i,i}|^2$.

Comparing the two descriptions of the trace of $AA^*$ gives $\sum_{i,j}|a_{i,j}|^2=\sum_i|a_{i,i}|^2$, so $\sum_{i\neq j}|a_{i,j}|^2=0$: every off-diagonal entry vanishes and $A$ is diagonal.
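The two trace descriptions can be sanity-checked on a randomly generated normal matrix $A = UDU^*$ with $U$ unitary and $D$ diagonal (a sketch assuming NumPy; the construction is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Build a random normal matrix A = U D U^* (U unitary from a QR factorization).
n = 4
D = np.diag(rng.normal(size=n) + 1j * rng.normal(size=n))
Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
A = Q @ D @ Q.conj().T

entry_sum = np.sum(np.abs(A) ** 2)                    # tr(A A^*), entrywise
eig_sum = np.sum(np.abs(np.linalg.eigvals(A)) ** 2)   # sum of |lambda|^2

print(abs(entry_sum - eig_sum))
```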

  • While correct, I would argue that this is not a "go to" proof, because the spectral theorem is itself proven based on the fact that a triangular normal matrix is diagonal. (Commented Sep 19, 2022 at 9:37)
  • @V.S.e.H. you are free to be wrong, of course :-) There is absolutely no need to base the proof of the spectral theorem on that, and doing so is actually very inconvenient. If $A$ is normal, then you can easily find a non-zero vector $v$ that is an eigenvector for both $A$ and $A^*$, and the orthogonal complement $v^\perp$ is invariant under $A$ and $A^*$, so you can do induction there. (Commented Sep 19, 2022 at 20:41)
  • It's not about being right or wrong; it's about having a self-contained proof that does not rely on other results, except the assumption that $A$ is normal, which is not the case in your proof, hence why I don't consider it a "go to" :P And, while I concede that proving the spectral theorem does not need to go through the usual Schur triangularization step, I do find this way of proving it extremely convenient and intuitive, and it seems to be the go-to proof in most textbooks, but I guess we can agree to disagree on this part. (Commented Sep 19, 2022 at 21:08)
  • That proof completely obscures the fact that the diagonalizability of normal maps is entirely similar to that of unitary ones: the argument I sketched is exactly the same one used for unitary matrices, for which the fact that $A$ shares an eigenvector with $A^*$ is more obvious. (Commented Sep 19, 2022 at 21:22)
