
I have a dataset composed of a large number of images, each of size 32×32 px, and I'm trying to adapt a feature extraction framework that assumes the input data is multivariate normally distributed.

My question is: is applying a 2D FFT to the whole dataset a good way to transform the data distribution into a multivariate normal one (by Central Limit Theorem arguments)?

In other words, if $X$ is an image from the dataset, I want its transform $\tilde{X}$ to satisfy $\tilde{X} \sim \mathcal{N}(\mu,\Sigma)$, where $\mu \in \mathbb{R}^{1024}$ and $\Sigma \in \mathbb{R}^{1024\times1024}$.

If not, is there another possible way to do that?

  • What does "multivariate" mean in this context? Do you mean each image is sampled from a 1024-dimensional Gaussian distribution? Commented Jul 21, 2023 at 9:29
  • Yes, I added some details to the question to make it clearer. Commented Jul 21, 2023 at 9:41

1 Answer


In statistics there are several transforms that try to make data approximately Gaussian.
They are called power transforms, with the Box–Cox transform being the best known.
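For illustration, here is a minimal sketch of fitting such transforms with SciPy and scikit-learn. The lognormal stand-in data and the `(n_images, 1024)` shape are assumptions made for the example, not part of the question's actual data:

```python
# A minimal sketch, assuming flattened 32x32 images stacked into an
# (n_images, 1024) matrix with strictly positive pixel values.
import numpy as np
from scipy import stats
from sklearn.preprocessing import PowerTransformer

rng = np.random.default_rng(0)
X = rng.lognormal(mean=0.0, sigma=0.7, size=(500, 1024))  # skewed stand-in data

# Single-feature Box-Cox: scipy fits the lambda parameter by maximum
# likelihood and returns the transformed data along with it.
x_bc, lam = stats.boxcox(X[:, 0])
print(f"fitted lambda for pixel 0: {lam:.3f}")

# Per-pixel transform of the whole dataset; 'yeo-johnson' also handles
# non-positive values, which 'box-cox' does not.
pt = PowerTransformer(method="yeo-johnson", standardize=True)
X_t = pt.fit_transform(X)
print(f"skewness before: {stats.skew(X[:, 0]):.3f}, "
      f"after: {stats.skew(X_t[:, 0]):.3f}")
```

Note that a per-pixel power transform only targets the marginal distributions; making each coordinate approximately Gaussian does not by itself guarantee joint multivariate normality.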

There is also the variance-stabilizing transform (VST), which is usually used in the context of denoising (see the Anscombe transform for Poisson noise).
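For concreteness, a minimal sketch of the Anscombe transform $A(x) = 2\sqrt{x + 3/8}$ applied to synthetic Poisson counts (the rate $\lambda = 20$ is an arbitrary choice for the example):

```python
# The Anscombe transform maps Poisson counts x to 2*sqrt(x + 3/8), so the
# result has approximately unit variance (and is closer to Gaussian) for
# moderate-to-large count rates.
import numpy as np

def anscombe(x):
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

rng = np.random.default_rng(1)
counts = rng.poisson(lam=20.0, size=100_000)
print(f"variance before: {counts.var():.2f}")            # ~ lambda (here ~20)
print(f"variance after:  {anscombe(counts).var():.2f}")  # ~ 1
```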

The Fourier transform, being a linear operation, doesn't do that.
What you can say is that if the data is a Gaussian vector, then its DFT is also Gaussian distributed.
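To make the linearity point concrete, here is a small numerical check (a sketch using 1D signals for brevity; the 2D case is analogous):

```python
# The DFT is multiplication by a fixed matrix F, i.e., an invertible linear
# change of basis. It therefore maps a Gaussian vector to a Gaussian vector,
# but it cannot turn an arbitrary distribution into a Gaussian one: the
# inverse DFT recovers the original data, non-Gaussianity included.
import numpy as np

n = 8
F = np.fft.fft(np.eye(n), axis=0)  # the n x n DFT matrix

rng = np.random.default_rng(3)
x = rng.normal(size=n)

# Linearity: np.fft.fft(x) is exactly the matrix-vector product F @ x.
assert np.allclose(np.fft.fft(x), F @ x)

# Invertibility: the inverse DFT recovers x exactly, so the transform is
# only a change of basis; it cannot create Gaussianity that wasn't there.
assert np.allclose(np.fft.ifft(np.fft.fft(x)).real, x)

print("DFT verified to act as a fixed linear map.")
```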

