
As the question states: Given $M \in \mathbb{C}^{n \times n}$ with $\det(M) = 0$, does there exist a real matrix $R \in \mathbb{R}^{n \times n}$ and an invertible matrix $E \in \mathbb{C}^{n \times n}$ such that $M = ER$? If not, what is required of $M$ for $R$ and $E$ to exist?

Such a decomposition is always possible when $M$ is invertible, but I want to know what happens when $M$ is singular. There is a similar decomposition, the coninvolutory decomposition, described in the paper A Real-Coninvolutory Analog of the Polar Decomposition.

From reading the paper, the coninvolutory decomposition also exists for a singular matrix if there is an invertible $X$ with $M = X\bar{M}$; however, the paper does not explain how one would find such an $X$ given $M$. Note that the paper formulates the question as the decomposition $M = RE$.

The question I am asking is a bit more general than the one in the paper. However, if the decomposition $M = ER$ always existed, then one could find a coninvolutory decomposition for any square complex matrix, which seems to contradict the paper. If it helps, you may assume that $M$ is Hermitian or upper triangular.


1 Answer


Let $M=A+iB$ where $A$ and $B$ are real matrices. The decomposition $M=ER$ is possible iff $$ \operatorname{rank}M=\operatorname{rank}\pmatrix{A\\ B}.\tag{1} $$ In the above, whether or not $M$ is singular is irrelevant (although there is no need to ask your question when $M$ is nonsingular, because there is a trivial solution $(E,R)=(M,I)$). Also, if you want the decomposition $M=RE$ instead of $M=ER$, condition $(1)$ should be modified to $\operatorname{rank}M=\operatorname{rank}\pmatrix{A&B}$.
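
For what it's worth, condition $(1)$ is straightforward to test numerically. Here is a minimal sketch with NumPy (the helper name `admits_ER` and the tolerance are my own choices, not part of the answer):

```python
import numpy as np

def admits_ER(M, tol=1e-10):
    """Test condition (1): rank(M) == rank([A; B]) with M = A + iB.

    True means the decomposition M = E R (E invertible complex, R real)
    should exist; for M = R E instead, stack [A, B] horizontally.
    """
    M = np.asarray(M, dtype=complex)
    A, B = M.real, M.imag
    return (np.linalg.matrix_rank(M, tol=tol)
            == np.linalg.matrix_rank(np.vstack([A, B]), tol=tol))
```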

Example. Consider $M=\pmatrix{1&i\\ 0&0}$. This is a rank-one matrix. If it admits the decomposition $ER$, then $R$ must also be rank-one. Therefore $R=uv^T$ for some nonzero real vectors $u$ and $v$. In turn, the row space of $M=Euv^T$ is the span of $v^T$. But this is impossible because the first row of $M$, namely $\pmatrix{1&i}$, is not a scalar multiple of any real vector. In this example one can verify that $$ \operatorname{rank}M=1<\operatorname{rank}\pmatrix{A\\ B}=\operatorname{rank}\pmatrix{1&0\\ 0&0\\ 0&1\\ 0&0}=2. $$
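
Checking these ranks numerically (a small sanity check with NumPy):

```python
import numpy as np

M = np.array([[1, 1j],
              [0, 0]])
print(np.linalg.matrix_rank(M))                             # 1
print(np.linalg.matrix_rank(np.vstack([M.real, M.imag])))   # 2, so no M = E R
```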

Proof of the necessity and sufficiency of $(1)$. Suppose $M=ER$. Let $E=X+iY$ where $X$ and $Y$ are real. By comparing both sides of the equality $M=ER$, we get $A=XR,\,B=YR$. Hence we obtain $(1)$ by the sandwich principle: \begin{align*} \operatorname{rank}M =\operatorname{rank}\left[\pmatrix{I&iI}\pmatrix{A\\ B}\right] \le\operatorname{rank}\pmatrix{A\\ B} =\operatorname{rank}\pmatrix{XR\\ YR} \le\operatorname{rank}R =\operatorname{rank}M. \end{align*}
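
As a quick numerical illustration of the necessity direction: any product of an invertible complex $E$ with a real $R$ satisfies $(1)$. A sketch (the particular way of making $R$ singular here is my own choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
R = rng.standard_normal((n, n))
R[:, -1] = 0                                    # force R (hence M) to be singular
E = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))  # invertible a.s.
M = E @ R
print(np.linalg.matrix_rank(M) ==
      np.linalg.matrix_rank(np.vstack([M.real, M.imag])))   # True
```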

Conversely, suppose $(1)$ is satisfied. Let $r=\operatorname{rank}(M)$. Then $s:=n-r=\dim\ker\pmatrix{A\\ B}$. Hence there exists a real invertible matrix $Q$ such that both $AQ$ and $BQ$ are of the form $$ \pmatrix{\ast&0_{r\times s}\\ \ast&0_{s\times s}} $$ and the polynomial matrix $(A+xB)Q$ (where $x$ is an indeterminate) also assumes the same form.

Since $\operatorname{rank}(A+iB)=\operatorname{rank}(M)=r$, in the block column marked by the two asterisks above, some $r$-rowed minor of $(A+iB)Q$ is nonzero, and the corresponding minor in $(A+xB)Q$ is a nonzero polynomial. Consequently, this minor, which is nonzero at $x=i$, is also nonzero for some real value $x=t$. Hence there exists a real invertible matrix $P$ (a row permutation moving the rows of that minor to the top, followed by row operations that leave those top rows unchanged) such that $$ P(A+tB)Q=\pmatrix{\ast&0_{r\times s}\\ 0_{s\times r}&0_{s\times s}} $$ where the subblock marked by an asterisk is invertible. But that means $PAQ$ and $PBQ$ are necessarily of the forms $$ PAQ=\pmatrix{A'&0\\ -tC'&0}\quad\text{and}\quad PBQ=\pmatrix{B'&0\\ C'&0} $$ for some $A',B'\in M_r(\mathbb R)$ and $C'\in M_{s,r}(\mathbb R)$ such that $A'+tB'$ is invertible; moreover $A'+iB'$ is invertible too, because the top $r$ rows of $P(A+iB)Q$ are exactly the permuted rows carrying the nonzero minor chosen above.

Now let $$ PXQ=\pmatrix{A'&0\\ -tC'&I_s},\quad PYQ=\pmatrix{B'&0\\ C'&0},\quad Q^{-1}RQ=\pmatrix{I_r&0\\ 0&0}. $$ Then $E:=X+iY=P^{-1}\pmatrix{A'+iB'&0\\ (i-t)C'&I_s}Q^{-1}$ is invertible and $XR=A,\ YR=B$. Hence $M=ER$.
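
For anyone who wants to carry this construction out explicitly (see also the comments below), here is a NumPy sketch that follows the proof step by step. It is an illustration under my own choices (helper name `decompose_ER`, tolerances, a random real $t$), not a numerically robust routine:

```python
import numpy as np

def decompose_ER(M, tol=1e-10, rng=None):
    """Build M = E @ R (E invertible complex, R real) following the proof above.

    Raises ValueError when condition (1) fails.  Illustrative sketch only.
    """
    rng = np.random.default_rng() if rng is None else rng
    M = np.asarray(M, dtype=complex)
    n = M.shape[0]
    A, B = M.real, M.imag
    r = int(np.linalg.matrix_rank(M, tol=tol))
    if np.linalg.matrix_rank(np.vstack([A, B]), tol=tol) != r:
        raise ValueError("condition (1) fails: no decomposition M = E R exists")
    s = n - r

    # Q = V from an SVD of [A; B]: its last s columns span ker[A; B], so the
    # last s columns of A @ Q and B @ Q vanish (numerically).
    _, _, Vt = np.linalg.svd(np.vstack([A, B]))
    Q = Vt.T                                   # real orthogonal, Q^{-1} = Q.T

    # Choose r rows of (A + iB) Q whose leading r columns are linearly
    # independent, and a permutation Pi moving them to the top.
    MQ = M @ Q
    chosen = []
    for i in range(n):
        if np.linalg.matrix_rank(MQ[chosen + [i], :r], tol=tol) == len(chosen) + 1:
            chosen.append(i)
        if len(chosen) == r:
            break
    perm = chosen + [i for i in range(n) if i not in chosen]
    Pi = np.eye(n)[perm]

    # Pick a real t making the same r x r submatrix of (A + t B) Q invertible;
    # a random t works with probability 1 because the minor is a nonzero polynomial.
    while True:
        t = rng.standard_normal()
        C = (A + t * B) @ Q
        if np.linalg.matrix_rank(C[chosen][:, :r], tol=tol) == r:
            break

    # P = [[I, 0], [-G F^{-1}, I]] Pi, so that P (A + t B) Q = [[F, 0], [0, 0]].
    PC = Pi @ C
    F, G = PC[:r, :r], PC[r:, :r]
    L = np.eye(n)
    L[r:, :r] = -G @ np.linalg.inv(F)
    P = L @ Pi

    # From the proof: E = M + P^{-1} diag(0_r, I_s) Q^{-1},  R = Q diag(I_r, 0) Q^{-1}.
    D = np.zeros((n, n))
    D[r:, r:] = np.eye(s)
    E = M + np.linalg.solve(P, D) @ Q.T
    R = Q @ np.diag([1.0] * r + [0.0] * s) @ Q.T
    return E, R


# A small demo on a singular matrix satisfying condition (1).
M = np.array([[1, 2],
              [1j, 2j]])
E, R = decompose_ER(M, rng=np.random.default_rng(0))
print(np.allclose(E @ R, M),                   # expected: True
      np.allclose(R, R.real),                  # R is real
      abs(np.linalg.det(E)) > 1e-8)            # E is invertible
```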

  • Could you go further into how one would find such a P and Q? (I think for Q it is the polar decomposition). Additionally, could you explain what it would mean for the matrix M when rank(M) < rank([A, B]^T)? I would also be happy to just reference papers if a direct answer is not possible. Commented Jul 3 at 4:16
  • If $\pmatrix{A\\ B}=USV^T$ is an SVD, you may set $Q=V$. The matrix $P$ is obtained in two steps. First, find a permutation matrix $\Pi$ that moves the rows containing a nonzero $r$-rowed minor of $(A+iB)Q$ to the top, so that the leading principal $r\times r$ submatrices of $\Pi(A+iB)Q$ and $\Pi(A+tB)Q=\pmatrix{F&0\\ G&0_{s\times s}}$ are invertible. Then set $P=\pmatrix{I_r&0\\ -GF^{-1}&I_s}\Pi$. Commented Jul 4 at 8:41
