
Let $X_{1}, X_{2}, \ldots, X_{n}$ be a random sample from a distribution $F(x; \mu, \sigma)$, where $\mu$ and $\sigma$ are location and scale parameters respectively, let $X_{1:n}, X_{2:n}, \ldots, X_{n:n}$ be the corresponding order statistics of the sample, and define $$Z_{r:n} = \frac{ X_{r:n} - \mu }{\sigma}$$ with $E(Z_{r:n}) = \alpha_{r}$, $\operatorname{Var}(Z_{r:n}) = V_{r,r}$ and $\operatorname{Cov}(Z_{r:n}, Z_{s:n}) = V_{r,s}$.

Since $X_{r:n} = \mu + \sigma Z_{r:n}$, we have

$E(X_{r:n}) = \mu + \sigma \alpha_{r}$, $\operatorname{Var}(X_{r:n}) = \sigma^{2}V_{r,r}$, and $\operatorname{Cov}(X_{r:n}, X_{s:n}) = \sigma^2 V_{r,s}$.

Now, writing $z = [ Z_{1:n}, Z_{2:n}, \ldots, Z_{n:n} ]'$ and $x = [ X_{1:n}, X_{2:n}, \ldots, X_{n:n} ]'$, we have $x = \mu 1 + \sigma z$, so

$E(z) = \alpha$, $\operatorname{Cov}(z) = V$, $E(x) = \mu 1 + \sigma \alpha$, and $\operatorname{Cov}(x) = \sigma^2 V$,

where $1$ is the $(n \times 1)$ vector of $1$'s, $\alpha$ is the $(n \times 1)$ vector of the $E(Z_{r:n})$, and $V$ is the $(n \times n)$ matrix of variances and covariances of the $Z_{r:n}$. We use the fact that the generalized least squares estimate of the parameters of the model $y = X \beta + \epsilon$ with $\operatorname{Cov}(\epsilon) = V$ is $\hat{\beta} = (X' V^{-1} X )^{-1} X' V^{-1} y$. The estimates of $\mu$ and $\sigma$ follow from

$E(x) = \mu 1 + \sigma \alpha$, which we can write as $E(x) = \begin{bmatrix} 1 & \alpha \end{bmatrix} \begin{bmatrix} \mu \\ \sigma \end{bmatrix} = X \beta$, with the vector of observed order statistics $y = x$ playing the role of the response.

After simplification,

$ \hat{\mu} = \frac{1}{\delta} \{ -\alpha' V^{-1} (1\alpha' - \alpha 1') V^{-1} y \} $

and

$ \hat{\sigma} = \frac{1}{\delta} \{ 1' V^{-1} (1\alpha' - \alpha 1') V^{-1} y \}, $

where $\delta = (1' V^{-1}1) (\alpha' V^{-1} \alpha) - (1' V^{-1} \alpha)^2$.

If the parent distribution is symmetric, then we have $\alpha_{r} = -\alpha_{n-r+1}$, hence $1' V^{-1} \alpha = 0$, and the ordered least squares estimators reduce to $\hat{\mu} = \dfrac{ 1' V^{-1} y }{ 1' V^{-1} 1 }$ and $\hat{\sigma} = \dfrac{ \alpha' V^{-1} y }{ \alpha' V^{-1} \alpha }$.
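As a numerical illustration (not part of the original derivation), here is a sketch in Python/NumPy that approximates $\alpha$ and $V$ for a standard normal parent by Monte Carlo (exact tables exist for small $n$; the simulation merely keeps the sketch self-contained) and evaluates the ordered least squares estimators above. The sample size $n$, the parent $N(\mu, \sigma)$, and the simulation settings are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000

# Monte Carlo approximation of alpha = E(Z_{r:n}) and V = Cov(Z_{r:n}, Z_{s:n})
# for a standard normal parent.
Z = np.sort(rng.standard_normal((reps, n)), axis=1)
alpha = Z.mean(axis=0)
V = np.cov(Z, rowvar=False)

one = np.ones(n)
Vinv = np.linalg.inv(V)

# An observed ordered sample y from a N(mu=10, sigma=2) parent.
mu_true, sigma_true = 10.0, 2.0
y = np.sort(mu_true + sigma_true * rng.standard_normal(n))

# The estimators in the form given above.
delta = (one @ Vinv @ one) * (alpha @ Vinv @ alpha) - (one @ Vinv @ alpha) ** 2
Gamma = Vinv @ (np.outer(one, alpha) - np.outer(alpha, one)) @ Vinv

mu_hat = -(alpha @ Gamma @ y) / delta   # estimate of mu
sigma_hat = (one @ Gamma @ y) / delta   # estimate of sigma
print(mu_hat, sigma_hat)
```

Algebraically, these closed forms must agree with the direct generalized least squares solution $\hat{\beta} = (X' V^{-1} X)^{-1} X' V^{-1} y$ with $X = [1 \; \alpha]$, which is an easy check on the simplification.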

  • Kindly give a detailed analytical explanation of this argument: if the parent distribution is symmetric, then $\alpha_r = -\alpha_{n-r+1}$, and hence $1' V^{-1} \alpha = 0$.

1 Answer


It is not difficult to show that "if the parent distribution is symmetric, then $\alpha_{r} = -\alpha_{n-r+1}$".

Since $X=\mu+(X-\mu)$, symmetry of the distribution (about $\mu$) means that $Y=\mu-(X-\mu)=2\mu-X$ has the same distribution as $X$. Moreover, the mapping $Y=2\mu-X$ reverses the order (it sends the maximum of $X$ to the minimum of $Y$ and vice versa). In general, $Y_{r:n}=2\mu-X_{n-r+1:n}$, and so $Y_{r:n}-\mu=-(X_{n-r+1:n}-\mu)$. Taking expectations on both sides gives $E(Y_{r:n}-\mu)=-E(X_{n-r+1:n}-\mu)$. Since $X$ and $Y$ have the same distribution, we can write the last equality as $E(X_{r:n}-\mu)=-E(X_{n-r+1:n}-\mu)$. Dividing both sides by $\sigma$ yields $\alpha_r = -\alpha_{n-r+1}$.
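This relation is easy to confirm by simulation. The sketch below (an illustration only; the parent, $n$, and replication count are arbitrary choices) estimates $\alpha_r = E(Z_{r:n})$ for a standard normal parent, which is symmetric about $0$, and checks that reversing $\alpha$ flips its sign.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 7, 400_000

# For a parent symmetric about 0, alpha_r = -alpha_{n-r+1},
# i.e. alpha reversed equals -alpha.
Z = np.sort(rng.standard_normal((reps, n)), axis=1)
alpha = Z.mean(axis=0)

print(alpha + alpha[::-1])  # each entry is near 0, up to Monte Carlo error
```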

Please see whether you can use these ideas to derive $1' V^{-1} \alpha = 0$.
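A sketch of how this can be checked numerically (going slightly beyond the hint above): the same reversal argument also gives $V_{r,s} = V_{n-r+1,\,n-s+1}$, i.e. $JVJ = V$ where $J$ is the exchange (reversal) matrix, while $J\alpha = -\alpha$ and $J1 = 1$. Then $1'V^{-1}\alpha = (J1)'(JVJ)^{-1}(J\alpha) = -1'V^{-1}\alpha$, forcing $1'V^{-1}\alpha = 0$. The snippet below imposes these two symmetry relations exactly on Monte Carlo estimates and confirms the product vanishes; all numerical settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 6, 100_000
Z = np.sort(rng.standard_normal((reps, n)), axis=1)
a = Z.mean(axis=0)
V = np.cov(Z, rowvar=False)

# Impose the exact symmetry relations the argument relies on:
# J a = -a  (alpha_r = -alpha_{n-r+1}) and
# J V J = V (V_{r,s} = V_{n-r+1, n-s+1}), with J the exchange matrix.
J = np.fliplr(np.eye(n))
a = (a - J @ a) / 2
V = (V + J @ V @ J) / 2

one = np.ones(n)
val = one @ np.linalg.inv(V) @ a
print(val)  # zero up to floating-point rounding
```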

