I have measurements of two position vectors, $\mathbf p_1$ and $\mathbf p_2$:

  • Each has its own mean position vector, $(\overline x_1, \overline y_1, \overline z_1)^T$ and $(\overline x_2,\overline y_2,\overline z_2)^T$ respectively.
  • Each has its own $3 \times 3$ variance-covariance matrix, $\Sigma_1$ and $\Sigma_2$ respectively.
  • $\mathbf p_1$ and $\mathbf p_2$ are independent.

How do I find the variances and covariances of the relative position vector $\mathbf p_2 - \mathbf p_1$, whose mean is $(\overline x_2 - \overline x_1, \overline y_2 - \overline y_1, \overline z_2 - \overline z_1)^T$? In other words, what is its $3 \times 3$ variance-covariance matrix?
  • Welcome to Math.SE! Can you expand on what you already know yourself? For example, what is the definition of variance and covariance? Commented May 5, 2015 at 9:48
  • When I say variance, I mean $Var()$, and when I say covariance, I mean $Cov()$. Commented May 5, 2015 at 10:00
  • I am sorry, but that is extremely uninformative. What does $Var()$ mean? Commented May 5, 2015 at 10:21
  • Yes, you are right, it is not very informative; sorry for that. Let me rephrase what I meant: the variance of the relative position vector means the diagonal terms of its variance-covariance matrix, while the covariance means the off-diagonal terms. I hope this is acceptable. Commented May 5, 2015 at 12:28
  • For me this still does not mean anything, because I don't know what the variance-covariance matrix is. This might be general knowledge in your field; I cannot tell. Commented May 5, 2015 at 14:25

1 Answer

If $\mathbf p_1$ and $\mathbf p_2$ are independent, you can simply sum the covariance matrices to obtain the covariance matrix of the relative position vector:

$\Sigma_{2-1} = \Sigma_1 + \Sigma_2$

This does not require normality: it holds for any two independent random vectors with finite second moments, because the cross-covariance terms vanish under independence. If, in addition, $\mathbf p_1$ and $\mathbf p_2$ are each multivariate normal, then $\mathbf p_2 - \mathbf p_1$ is itself multivariate normal, with mean $(\overline x_2 - \overline x_1, \overline y_2 - \overline y_1, \overline z_2 - \overline z_1)^T$ and covariance matrix $\Sigma_1 + \Sigma_2$.

For the derivation in the normal case, see e.g.: Multivariate Normal Difference Distribution
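The sum rule follows directly from the definition of the covariance matrix. Writing $\boldsymbol\mu_1$ and $\boldsymbol\mu_2$ for the means of $\mathbf p_1$ and $\mathbf p_2$:

$$\operatorname{Cov}(\mathbf p_2 - \mathbf p_1) = \mathbb E\big[\big((\mathbf p_2 - \boldsymbol\mu_2) - (\mathbf p_1 - \boldsymbol\mu_1)\big)\big((\mathbf p_2 - \boldsymbol\mu_2) - (\mathbf p_1 - \boldsymbol\mu_1)\big)^T\big]$$
$$= \underbrace{\mathbb E\big[(\mathbf p_2 - \boldsymbol\mu_2)(\mathbf p_2 - \boldsymbol\mu_2)^T\big]}_{\Sigma_2} + \underbrace{\mathbb E\big[(\mathbf p_1 - \boldsymbol\mu_1)(\mathbf p_1 - \boldsymbol\mu_1)^T\big]}_{\Sigma_1} - \mathbb E\big[(\mathbf p_2 - \boldsymbol\mu_2)(\mathbf p_1 - \boldsymbol\mu_1)^T\big] - \mathbb E\big[(\mathbf p_1 - \boldsymbol\mu_1)(\mathbf p_2 - \boldsymbol\mu_2)^T\big]$$

The last two (cross-covariance) terms are zero because $\mathbf p_1$ and $\mathbf p_2$ are independent, leaving $\Sigma_1 + \Sigma_2$. Note that only independence and finite second moments are used; normality plays no role here.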
  • Do you mean that I can obtain the variance-covariance matrix of the relative position vector as follows? $$\sigma^2_{x_{2-1}} = \sigma^2_{x_2} + \sigma^2_{x_1} $$ $$\sigma^2_{y_{2-1}} = \sigma^2_{y_2} + \sigma^2_{y_1} $$ $$\sigma^2_{z_{2-1}} = \sigma^2_{z_2} + \sigma^2_{z_1} $$ and $$Cov({x_{2-1}},{y_{2-1}}) = Cov({x_2},{y_2}) + Cov({x_1},{y_1})$$ $$Cov({x_{2-1}},{z_{2-1}}) = Cov({x_2},{z_2}) + Cov({x_1},{z_1})$$ $$Cov({y_{2-1}},{z_{2-1}}) = Cov({y_2},{z_2}) + Cov({y_1},{z_1})$$ Commented May 5, 2015 at 10:15
  • Yes, that's how I have understood it. If $p_1$ and $p_2$ are each multivariate normal, you can. For other distributions I'm just not sure at the moment. Commented May 5, 2015 at 10:26
  • I think I missed the title, where it says bivariate, which is a special case of multivariate; so yes, here you can just sum the covariance matrices. Commented May 5, 2015 at 13:24
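The component-wise sums in the first comment above are exactly the entries of $\Sigma_1 + \Sigma_2$, and this can be checked numerically. Below is a minimal Monte Carlo sketch in Python with NumPy; the means and covariance matrices are arbitrary illustrative values, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent 3-D position measurements with illustrative means
# and (symmetric positive definite) covariance matrices.
mean1 = np.array([1.0, 2.0, 3.0])
mean2 = np.array([-1.0, 0.5, 2.0])
A1 = rng.normal(size=(3, 3))
A2 = rng.normal(size=(3, 3))
Sigma1 = A1 @ A1.T + 0.1 * np.eye(3)
Sigma2 = A2 @ A2.T + 0.1 * np.eye(3)

# Draw independent samples of p1 and p2.
n = 200_000
p1 = rng.multivariate_normal(mean1, Sigma1, size=n)
p2 = rng.multivariate_normal(mean2, Sigma2, size=n)

# Empirical 3x3 covariance matrix of the relative position vector.
diff = p2 - p1
Sigma_diff = np.cov(diff, rowvar=False)

# For independent p1 and p2, Cov(p2 - p1) = Sigma1 + Sigma2,
# so the empirical matrix should match the sum up to sampling noise.
print(np.round(Sigma_diff - (Sigma1 + Sigma2), 2))
```

Both the diagonal entries (the variances in the comment) and the off-diagonal entries (the covariances) of `Sigma_diff` agree with `Sigma1 + Sigma2` up to Monte Carlo error.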
