The Rodrigues rotation formula gives the exponential of a skew-symmetric matrix in three dimensions, and the exponential of a skew-symmetric matrix in two dimensions is given by Euler's formula. Is there a general formula (or set of formulas) for the exponential of a skew-symmetric matrix in any dimension?
- $\begingroup$ See this paper by Kaiser: arxiv.org/abs/2308.12123 The formulae have no closed form even for modest dimensions. $\endgroup$ – lightxbulb, Apr 25, 2025 at 9:37
- $\begingroup$ Depends on what you mean by closed form. Here's a closed-form solution in terms of singular values and vectors (mainline.brynmawr.edu/~dxu/206-2550-2.pdf) $\endgroup$ – Math Student, Jun 9, 2025 at 21:36
3 Answers
The spectral decomposition of any skew-symmetric matrix $A$ is given by $A=U Q U^\dagger$ where $U$ is unitary and \begin{align*} Q=\begin{bmatrix} 0 & \lambda_1 & \\ -\lambda_1 & 0 & \\ & & 0 & \lambda_2\\ & & -\lambda_2 &0\\ & & & & \ddots\\ & & & & & 0 & \lambda_r\\ & & & & & -\lambda_r & 0\\ & & & & & & & 0\\ & & & & & & & &\ddots\\ & & & & & & & & & 0 \end{bmatrix} \end{align*} where the omitted entries are all $0$, left blank for readability.
The exponential of a matrix is defined by extending the Taylor expansion of $e^x$ (the series converges for every square matrix, so there is no convergence issue here), hence \begin{align*} e^A = \sum_{n=0}^\infty \frac{1}{n!} A^n \end{align*}
and since $U$ is unitary, $A^n = U Q U^\dagger \cdots U Q U^\dagger = U Q^n U^\dagger$, so we aim to find an expression for $Q^n$. This is not trivial, but the right way to go is to compute it for several values of $n$, spot a pattern, and then prove that the pattern holds. It is easy to prove by induction that for any $k\in\mathbb N$, $Q^{2k}$ and $Q^{2k+1}$ are respectively $$ \begin{bmatrix} (-1)^k\lambda_1^{2k} & 0 & \\ 0 & (-1)^k\lambda_1^{2k} & \\ & & \ddots\\ & & & (-1)^k\lambda_r^{2k} & 0\\ & & & 0 & (-1)^k\lambda_r^{2k}\\ & & & & & 0\\ & & & & & &\ddots\\ & & & & & & & 0 \end{bmatrix}\\ \text{and}\\ \begin{bmatrix} 0 & (-1)^k\lambda_1^{2k+1} & \\ -(-1)^k\lambda_1^{2k+1} & 0 & \\ & & \ddots\\ & & & 0 & (-1)^k\lambda_r^{2k+1}\\ & & & -(-1)^k\lambda_r^{2k+1} & 0\\ & & & & & 0\\ & & & & & &\ddots\\ & & & & & & & 0 \end{bmatrix} $$ For this proof the base cases are $Q^0=I$ and $Q^1=Q$; then induct on $k$.
We thus have \begin{align*} e^A &= \sum_{n=0}^\infty \frac{1}{n!} A^n\\ &= \sum_{n=0}^\infty \frac{1}{n!} U Q^n U^\dagger\\ &= U \left( \sum_{n=0}^\infty \frac{1}{n!} Q^n \right) U^\dagger \end{align*} and by plugging in the expressions for $Q^n$ above, the diagonal terms are of the form $\sum_{k=0}^\infty \frac{1}{(2k)!} (-1)^k \lambda_p^{2k}=\cos(\lambda_p)$ and the off-diagonal elements are of the form $\pm \sum_{k=0}^\infty \frac{1}{(2k+1)!} (-1)^k \lambda_p^{2k+1}=\pm \sin(\lambda_p)$, hence \begin{align*} e^A = U e^Q U^\dagger \end{align*} with \begin{align*} e^Q = \begin{bmatrix} \cos(\lambda_1) & \sin(\lambda_1) & \\ -\sin(\lambda_1) & \cos(\lambda_1) & \\ & & \cos(\lambda_2) & \sin(\lambda_2)\\ & & -\sin(\lambda_2) &\cos(\lambda_2)\\ & & & & \ddots\\ & & & & & \cos(\lambda_r) & \sin(\lambda_r)\\ & & & & & -\sin(\lambda_r) & \cos(\lambda_r)\\ & & & & & & & 1\\ & & & & & & & &\ddots\\ & & & & & & & & & 1 \end{bmatrix} \end{align*}
So an algorithmic way to compute the exponential of your matrix is to find this decomposition and then apply the last formula.
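As a sketch of that algorithm (assuming NumPy/SciPy; as a comment below points out, the real Schur decomposition of a real skew-symmetric matrix produces exactly the block form above, so `scipy.linalg.schur` does the hard work):

```python
import numpy as np
from scipy.linalg import schur, expm

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M - M.T  # a random real skew-symmetric matrix

# Real Schur form: A = U T U^T with U orthogonal and T block-diagonal,
# made of 2x2 blocks [[0, lam], [-lam, 0]] and 1x1 zero blocks.
T, U = schur(A, output='real')

# Exponentiate T block by block: each 2x2 block becomes a rotation
# [[cos lam, sin lam], [-sin lam, cos lam]]; each 1x1 zero block becomes 1.
eT = np.eye(5)
i = 0
while i < 5:
    if i + 1 < 5 and abs(T[i + 1, i]) > 1e-12:  # start of a 2x2 block
        lam = T[i, i + 1]
        c, s = np.cos(lam), np.sin(lam)
        eT[i:i + 2, i:i + 2] = [[c, s], [-s, c]]
        i += 2
    else:                                       # 1x1 zero block: exp(0) = 1
        i += 1

E = U @ eT @ U.T  # e^A = U e^T U^T
```

The result agrees with `scipy.linalg.expm(A)` to machine precision; the `1e-12` threshold used to detect the start of a $2\times 2$ block is an ad hoc choice for this sketch.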
- $\begingroup$ Put ones in the southeast corner of $e^Q$. $\endgroup$ – user91684, Jan 10, 2019 at 11:04
- $\begingroup$ @loupblanc thanks, can you confirm that this comes from the fact that $Q^0=I$? $\endgroup$ – P. Quinton, Jan 10, 2019 at 12:42
- $\begingroup$ Indeed I confirm. $\endgroup$ – user91684, Jan 10, 2019 at 12:44
- $\begingroup$ Note that this is also the real Schur decomposition of $A$. $\endgroup$ – Nichola, Apr 18, 2022 at 2:48
- 1 $\begingroup$ @Nichola I guess the Schur decomposition takes this special form when the matrix is real skew-symmetric? $\endgroup$ – P. Quinton, Apr 18, 2022 at 16:12
I may be wrong, but this could be more helpful in practice, at least for real matrices. We can split the exponential series into two subseries, for even and odd terms: $$ e^A = \sum_{k=0}^\infty \frac{A^k}{k!} = \sum_{k=0}^\infty \frac{A^{2k}}{(2k)!} + \sum_{k=0}^\infty\frac{A^{2k+1}}{(2k+1)!} $$ and setting $Q=A^2$: $$ e^A = \sum_{k=0}^\infty \frac{Q^{k}}{(2k)!} + A\sum_{k=0}^\infty\frac{Q^{k}}{(2k+1)!} $$
Now, since $A$ is skew-symmetric, $P = A A^T = -Q$, so we also get: $$ e^A = \sum_{k=0}^\infty (-1)^k \frac{P^{k}}{(2k)!} + A\sum_{k=0}^\infty (-1)^k\frac{P^{k}}{(2k+1)!} $$ and these are the series expansions for: $$ \sum_{k=0}^\infty (-1)^k \frac{P^{k}}{(2k)!} = \cos(P^{1/2}) \\ \sum_{k=0}^\infty (-1)^k\frac{P^{k}}{(2k+1)!} = \operatorname{sinc}(P^{1/2}) $$ so that $$ e^A = \cos(P^{1/2}) + A\,\operatorname{sinc}(P^{1/2}) $$ (Note it's $\operatorname{sinc}(x)=\sin(x)/x$, and $\operatorname{sinc}(0)=1$.)
Since $P$ is guaranteed to be symmetric and positive semi-definite, it is "easily" diagonalizable, and $\cos(P^{1/2})$ and $\operatorname{sinc}(P^{1/2})$ are trivial to compute after that.
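A minimal sketch of that recipe (assuming NumPy/SciPy; note that NumPy's `np.sinc` is the normalized $\sin(\pi x)/(\pi x)$, so its argument must be rescaled to get the unnormalized $\operatorname{sinc}$ used here):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))
A = M - M.T          # skew-symmetric
P = A @ A.T          # symmetric positive semi-definite, P = -A^2

w, V = np.linalg.eigh(P)             # P = V diag(w) V^T with w >= 0
s = np.sqrt(np.clip(w, 0.0, None))   # eigenvalues of P^{1/2}; clip guards tiny negatives
cos_P = V @ np.diag(np.cos(s)) @ V.T
# unnormalized sinc(x) = sin(x)/x: rescale because np.sinc(x) = sin(pi x)/(pi x)
sinc_P = V @ np.diag(np.sinc(s / np.pi)) @ V.T

E = cos_P + A @ sinc_P               # e^A = cos(P^{1/2}) + A sinc(P^{1/2})
```

Computing functions of $P^{1/2}$ through the eigendecomposition of $P$ also handles the singular values that are zero gracefully, since $\operatorname{sinc}(0)=1$.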
(If you have an SVD $A = U \sigma V^T$, with $\sigma$ the diagonal matrix of singular values, that would be $e^A = U [\cos(\sigma) U^T + \sin(\sigma) V^T]$.)
EDIT: It may be worth expanding on this last comment. From the SVD of $A$ we automatically get the eigenvalue decomposition of $P$: $P = U \sigma^2 U^T$, so: $$ \cos(P^{1/2}) = U \cos(\sigma) U^T \\ \operatorname{sinc}(P^{1/2}) = U \operatorname{sinc}(\sigma) U^T $$ and because all the powers of $A$ commute, we could also have written: $$ \begin{align} e^A &= \cos(P^{1/2}) + \operatorname{sinc}(P^{1/2})\,A \\ &= U \cos(\sigma) U^T + U \operatorname{sinc}(\sigma) U^T U \sigma V^T \\ &= U \cos(\sigma) U^T + U \operatorname{sinc}(\sigma) \sigma V^T \\ &= U \cos(\sigma) U^T + U \sin(\sigma) V^T \end{align} $$ QED.
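The SVD identity can be checked numerically in a few lines (a sketch assuming NumPy/SciPy):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M - M.T                    # skew-symmetric

U, s, Vt = np.linalg.svd(A)    # A = U diag(s) Vt
# e^A = U [cos(sigma) U^T + sin(sigma) V^T]
E = U @ (np.diag(np.cos(s)) @ U.T + np.diag(np.sin(s)) @ Vt)
```

This matches `scipy.linalg.expm(A)` to machine precision (for a skew-symmetric matrix the nonzero singular values come in equal pairs, but the identity holds for any valid SVD).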
- $\begingroup$ Wow, fantastic answer. That last comment in parentheses is gold! Thank you! $\endgroup$ – tommym, Apr 23, 2025 at 20:57
Somebody should mention that a skew-symmetric $A$ commutes with $-A = A^T$, implying that $U=e^A$ satisfies $UU^T=e^A e^{A^T}=e^A e^{-A}=e^{A-A}=I_n,$ meaning that $U$ is orthogonal.
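This is easy to verify numerically (a sketch assuming SciPy's `expm`); the determinant is $+1$ as well, since $\det e^A = e^{\operatorname{tr} A} = e^0 = 1$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5))
A = M - M.T        # skew-symmetric
U = expm(A)

# U is orthogonal with determinant +1, i.e. U lies in SO(5)
ortho = np.allclose(U @ U.T, np.eye(5))
det = np.linalg.det(U)
```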