
Let $X$ be a $d$-dimensional random vector drawn from a Gaussian mixture
$$ X \sim \sum_{k=1}^K \pi_k \, \mathcal{N}_d(\mu_k, \Sigma_k), $$ and let
$$ Y = X + N, \quad N \sim \mathcal{N}_d(0, \Sigma_N), $$ with $N$ independent of $X$.

The MMSE estimator is
$$ \hat{X}(Y) = \mathbb{E}[X \mid Y], $$ and I want the expected squared error
$$ \mathbb{E}\!\left[\|X - \hat{X}(Y)\|^2\right]. $$
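
For concreteness, here is only a sketch of the standard intermediate step (not the full expression): the posterior is again a $K$-component Gaussian mixture, and the conditional mean can be written as
$$ \hat{X}(y) = \sum_{k=1}^K w_k(y)\left[\mu_k + \Sigma_k(\Sigma_k + \Sigma_N)^{-1}(y - \mu_k)\right], \qquad w_k(y) = \frac{\pi_k\,\mathcal{N}_d(y;\,\mu_k,\,\Sigma_k+\Sigma_N)}{\sum_{j=1}^K \pi_j\,\mathcal{N}_d(y;\,\mu_j,\,\Sigma_j+\Sigma_N)}, $$
so the expected error becomes an integral of the posterior covariance over the mixture marginal of $Y$, and that is where the expression gets long.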

I can derive this quantity directly (it involves the posterior mixture weights and Gaussian integrals, as sketched above), but the resulting expression is long and not central to my main topic. I’m writing a scientific paper in which this is only a technical intermediate result, so I would prefer to cite a standard reference, i.e., a book or article that explicitly states or derives the MMSE (expected error) for a Gaussian mixture prior with additive Gaussian noise. Citing it would also save considerable space in the submission, which is of course limited.
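
In case it helps to pin down exactly what I mean, below is a minimal Monte Carlo sketch of the quantity (all parameter values are made up; NumPy/SciPy): it estimates $\mathbb{E}\!\left[\|X - \hat{X}(Y)\|^2\right]$ by sampling $(X, Y)$ pairs and evaluating the posterior mean above, rather than using the analytical expression I am asking a reference for.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical two-component mixture in d = 2 (parameter values made up)
d = 2
pi = np.array([0.4, 0.6])
mu = [np.array([-2.0, 0.0]), np.array([2.0, 1.0])]
Sigma = [np.eye(d), np.array([[1.5, 0.3], [0.3, 0.5]])]
Sigma_N = 0.5 * np.eye(d)

def posterior_mean(y):
    """E[X | Y = y] for the GMM prior with additive Gaussian noise N(0, Sigma_N)."""
    # Posterior mixture weights, using Y ~ sum_k pi_k N(mu_k, Sigma_k + Sigma_N)
    log_w = np.array([
        np.log(pi[k]) + multivariate_normal.logpdf(y, mu[k], Sigma[k] + Sigma_N)
        for k in range(len(pi))
    ])
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Per-component Gaussian conditional means (standard linear/Gaussian formula)
    means = [
        mu[k] + Sigma[k] @ np.linalg.solve(Sigma[k] + Sigma_N, y - mu[k])
        for k in range(len(pi))
    ]
    return sum(w[k] * means[k] for k in range(len(pi)))

# Monte Carlo over (X, Y) pairs
n = 50_000
z = rng.choice(len(pi), size=n, p=pi)
x = np.empty((n, d))
for k in range(len(pi)):
    idx = (z == k)
    x[idx] = rng.multivariate_normal(mu[k], Sigma[k], size=int(idx.sum()))
y = x + rng.multivariate_normal(np.zeros(d), Sigma_N, size=n)

x_hat = np.array([posterior_mean(yi) for yi in y])
mmse = np.mean(np.sum((x - x_hat) ** 2, axis=1))
print(f"Monte Carlo estimate of E[||X - E[X|Y]||^2]: {mmse:.4f}")
```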

Could anyone point me to such a reference in estimation theory, Bayesian statistics, or signal processing?
