
I've gotten a lot of usage out of principal component analysis, and after recently learning the basics of performing canonical polyadic decomposition I was intrigued to learn that there exists a multilinear principal components analysis (MPCA).

The trouble is, the Wikipedia page doesn't really clarify what it is mathematically. What it says is:

MPCA computes a set of orthonormal matrices associated with each mode of the data tensor which are analogous to the orthonormal row and column space of a matrix computed by the matrix SVD. This transformation aims to capture as high a variance as possible, accounting for as much of the variability in the data associated with each data tensor mode (axis).

On the face of it, this usage of orthogonal matrices along the modes of the tensor sounds similar to the higher-order singular value decomposition (HOSVD), which is a special case of the Tucker decomposition.

What is multilinear principal components analysis?


I just tracked down this lengthy paper, which might hold the answer.


3 Answers


Here are some excerpts from Lu, Plataniotis, and Venetsanopoulos (2006).

Let $\{ \mathcal{A}_m, m=1,\cdots, M \}$ be a set of $M$ tensor samples in $\mathbb{R}^{I_1} \otimes \mathbb{R}^{I_2} \otimes \cdots \otimes \mathbb{R}^{I_N}$. The total scatter of these tensors is defined as $\Psi_{\mathcal{A}} = \sum_{m=1}^M \| \mathcal{A}_m - \bar{\mathcal{A}} \|_F^2$, where $\bar{\mathcal{A}}$ is the mean tensor, calculated as $\bar{\mathcal{A}} = \frac{1}{M} \sum_{m=1}^M \mathcal{A}_m$. The $n$-mode total scatter matrix of these samples is then defined as $\mathbf{S}_{T_{\mathcal{A}}}^{(n)} = \sum_{m=1}^M (\mathbf{A}_{m(n)} - \bar{\mathbf{A}}_{(n)}) (\mathbf{A}_{m(n)} - \bar{\mathbf{A}}_{(n)})^T$, where $\mathbf{A}_{m(n)}$ is the $n$-mode unfolded matrix of $\mathcal{A}_m$. The statement above leads to the following formal definition of the problem to be solved:
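For concreteness, here is a minimal numpy sketch of the $n$-mode total scatter matrix defined above; the function names and the random example are my own illustration, not code from the paper:

```python
import numpy as np

def mode_n_unfold(A, n):
    """Unfold tensor A along mode n (0-indexed): result is I_n x (product of the other dims)."""
    return np.moveaxis(A, n, 0).reshape(A.shape[n], -1)

def mode_n_scatter(samples, n):
    """n-mode total scatter: sum_m (A_m(n) - Abar_(n)) (A_m(n) - Abar_(n))^T."""
    mean = np.mean(samples, axis=0)          # mean tensor over the M samples
    S = np.zeros((samples.shape[n + 1],) * 2)
    for A in samples:
        D = mode_n_unfold(A - mean, n)       # centered n-mode unfolding
        S += D @ D.T
    return S

# Example: M = 20 random 3rd-order samples of size 5 x 6 x 7
samples = np.random.randn(20, 5, 6, 7)
print(mode_n_scatter(samples, 0).shape)      # (5, 5)
```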

A set of $M$ tensor objects $\{ \chi_1, \chi_2, \cdots, \chi_M \}$ is available for training. Each tensor object $\chi_m \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$ assumes values in a tensor space $\mathbb{R}^{I_1} \otimes \mathbb{R}^{I_2} \otimes \cdots \otimes \mathbb{R}^{I_N}$, where $I_n$ is the $n$-mode dimension of the tensor. The MPCA objective is to define a multilinear transformation $\{ \tilde{\mathbf{U}}^{(n)} \in \mathbb{R}^{I_n \times P_n}, n = 1, \cdots, N \}$ that maps the original tensor space $\mathbb{R}^{I_1} \otimes \mathbb{R}^{I_2} \otimes \cdots \otimes \mathbb{R}^{I_N}$ into a tensor subspace $\mathbb{R}^{P_1} \otimes \mathbb{R}^{P_2} \otimes \cdots \otimes \mathbb{R}^{P_N}$ (with $P_n < I_n$, for $n=1,\cdots,N$): $\mathcal{Y}_m = \chi_m \times_1 \tilde{\mathbf{U}}^{(1)^T} \times_2 \tilde{\mathbf{U}}^{(2)^T} \cdots \times_N \tilde{\mathbf{U}}^{(N)^T}$, $m=1, \cdots, M$, such that $\{ \mathcal{Y}_m \in \mathbb{R}^{P_1} \otimes \mathbb{R}^{P_2} \otimes \cdots \otimes \mathbb{R}^{P_N}, m=1, \cdots, M \}$ captures most of the variation observed in the original tensor objects, assuming that this variation is measured by the total scatter.

In other words, the MPCA objective is the determination of the $N$ projection matrices $\{ \tilde{\mathbf{U}}^{(n)} \in \mathbb{R}^{I_n \times P_n}, n=1,\cdots,N \}$ that maximize the total tensor scatter $\Psi_{\mathcal{Y}}$: $$\{ \tilde{\mathbf{U}}^{(n)}, n=1, \cdots, N \} = \arg\max_{\tilde{\mathbf{U}}^{(1)}, \tilde{\mathbf{U}}^{(2)}, \cdots, \tilde{\mathbf{U}}^{(N)}} \Psi_{\mathcal{Y}}.$$
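To make the objective concrete, here is a rough sketch (my own simplification, not the paper's reference implementation) of the alternating maximization that Lu et al. describe: each $\tilde{\mathbf{U}}^{(n)}$ is updated as the leading eigenvectors of the mode-$n$ scatter of the samples projected through all the other modes:

```python
import numpy as np

def mode_n_product(A, U, n):
    """Mode-n product A x_n U: multiplies mode n of A by the matrix U."""
    return np.moveaxis(np.tensordot(U, A, axes=(1, n)), 0, n)

def mpca(samples, ranks, n_iter=5):
    """samples: array of shape (M, I_1, ..., I_N); ranks: (P_1, ..., P_N)."""
    X = samples - samples.mean(axis=0)               # center the samples
    N = X.ndim - 1
    U = []
    # Initialize each U^(n) from the full mode-n scatter ("full projection").
    for n in range(N):
        Dn = np.moveaxis(X, n + 1, 1).reshape(X.shape[0], X.shape[n + 1], -1)
        S = sum(D @ D.T for D in Dn)
        _, vecs = np.linalg.eigh(S)
        U.append(vecs[:, -ranks[n]:])                # top-P_n eigenvectors
    for _ in range(n_iter):                          # alternate over the modes
        for n in range(N):
            Y = X
            for k in range(N):
                if k != n:                           # project all modes except n
                    Y = mode_n_product(Y, U[k].T, k + 1)
            Dn = np.moveaxis(Y, n + 1, 1).reshape(Y.shape[0], Y.shape[n + 1], -1)
            S = sum(D @ D.T for D in Dn)
            _, vecs = np.linalg.eigh(S)
            U[n] = vecs[:, -ranks[n]:]
    return U

U = mpca(np.random.randn(20, 5, 6, 7), ranks=(2, 3, 3))
print([u.shape for u in U])                          # [(5, 2), (6, 3), (7, 3)]
```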

I guess this technically answers the question, but I am going to have to ruminate on this some more.


A different way to think about it is to write down the factor model for the tensor $$Y = \sum_{r=1}^R v_{1,r}\otimes v_{2,r}\otimes\dots\otimes v_{d,r} + U.$$ This factor model is the CP decomposition plus noise. It is then possible to estimate the loadings and factors in this model by applying standard PCA (or SVD) to the flattened tensor; see https://junsupan.github.io and https://arxiv.org/abs/2212.12981 for more details.
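As a rough illustration of this estimation idea (the names and parameters are my own, and this is not the linked paper's estimator): generate a rank-$R$ CP tensor plus noise, flatten it along mode 1, and take the top-$R$ left singular vectors as estimates of the mode-1 loadings:

```python
import numpy as np

rng = np.random.default_rng(0)
R, dims = 3, (30, 40, 50)
factors = [rng.standard_normal((d, R)) for d in dims]        # true loadings v_{k,r}

# Y = sum_r v_{1,r} (x) v_{2,r} (x) v_{3,r} + noise U
Y = np.einsum('ir,jr,kr->ijk', *factors) + 0.1 * rng.standard_normal(dims)

# Flatten (matricize) along mode 1 and apply plain SVD / PCA
Y1 = Y.reshape(dims[0], -1)                                  # 30 x 2000 matrix
Uhat, s, _ = np.linalg.svd(Y1, full_matrices=False)
V1_hat = Uhat[:, :R]                                         # estimated mode-1 loadings

# The estimated column space should be close to the true span of the v_{1,r}
proj = V1_hat @ V1_hat.T
print(np.linalg.norm(proj @ factors[0] - factors[0]) / np.linalg.norm(factors[0]))
```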

  • At a glance it looks like the tensor principal component analysis described in the blog post is the canonical polyadic decomposition plus an error term. Is that description correct?
  • Yes, that's what the factor model is: SVD + noise. en.wikipedia.org/wiki/Factor_analysis
  • Oh yes, I think I see that. So SVD, which decomposes matrices, is applied to a reshaped (i.e., matricized) tensor in order to generalize PCA to d-mode/order tensors?
  • Perhaps you can edit that information into the answer, along with any other clarifying comments about this method. As it stands, this answer is a bit skinny.
  • Thank you, added.

Here is a gentle introduction to Multilinear Principal Component Analysis (2002) and Multilinear Independent Component Analysis (2005).

MPCA is computed with the N-mode SVD algorithm: https://www.media.mit.edu/~maov/tensorfaces/eccv02_corrected.pdf

MICA is computed with the N-mode ICA algorithm: https://www.media.mit.edu/~maov/mica/mica05.pdf
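For reference, here is a bare-bones sketch of the N-mode SVD (HOSVD) idea: the mode-$n$ factor is taken from the left singular vectors of the mode-$n$ unfolding of the data tensor. The truncation ranks and names below are illustrative, not code from the linked papers:

```python
import numpy as np

def n_mode_svd(D, ranks):
    """Return mode factors U^(n) and core Z with D ≈ Z x_1 U^(1) ... x_N U^(N)."""
    U = []
    for n, r in enumerate(ranks):
        Dn = np.moveaxis(D, n, 0).reshape(D.shape[n], -1)    # mode-n unfolding
        u, _, _ = np.linalg.svd(Dn, full_matrices=False)
        U.append(u[:, :r])                                   # top-r left singular vectors
    Z = D
    for n, u in enumerate(U):                                # core: project every mode
        Z = np.moveaxis(np.tensordot(u.T, Z, axes=(1, n)), 0, n)
    return U, Z

U, Z = n_mode_svd(np.random.randn(8, 9, 10), ranks=(3, 3, 3))
print(Z.shape)   # (3, 3, 3)
```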

TensorFaces applies multilinear PCA to face recognition: the N-mode SVD algorithm is employed to perform MPCA on images that are vectorized (i.e., each image is a single observation) and organized into a data tensor. This contrasts with treating each image as "matrix-valued"; most arguments in favor of the "matrix-valued" treatment do not stand up to analytical scrutiny.

"Causal Deep Learning", M. Alex O. Vasilescu, In the Proceedings of the 2024 27h International Conference on Pattern Recognition (ICPR 2024)
