
Let $\varepsilon_1, \dots, \varepsilon_n$ be independent random variables with $E(\varepsilon_i) = 0$. Let $f: [0,1] \to \mathbb{R}$ be a Lipschitz function with constant $K > 0$, i.e., $$|f(x) - f(y)| \leq K |x - y| \quad \forall x, y \in [0,1].$$ Define $$Y_i = f\!\left(\frac{i}{n}\right) + \varepsilon_i, \quad i = 1, \dots, n.$$ Consider the estimator $$\hat{\sigma}_n = \frac{1}{2(n-1)} \sum_{i=1}^{n-1} (Y_{i+1} - Y_i)^2.$$ Prove that $$\lim_{n \to \infty} E(\hat{\sigma}_n) = \operatorname{var}(\varepsilon_1), \qquad \lim_{n \to \infty} \operatorname{var}(\hat{\sigma}_n) = 0.$$ From this, deduce $$\lim_{n \to \infty} E\!\left[ \left( \hat{\sigma}_n - \operatorname{var}(\varepsilon_1) \right)^2 \right] = 0.$$ I am stuck on the part $\lim_{n \to \infty} \operatorname{var}(\hat{\sigma}_n) = 0$; I would appreciate any ideas.
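For context, here is the expansion I have been working with (the shorthand $d_i$ is my own, not from the problem statement):
$$Y_{i+1} - Y_i = d_i + (\varepsilon_{i+1} - \varepsilon_i), \qquad d_i := f\!\left(\tfrac{i+1}{n}\right) - f\!\left(\tfrac{i}{n}\right), \qquad |d_i| \le \frac{K}{n},$$
so that
$$\hat{\sigma}_n = \frac{1}{2(n-1)} \sum_{i=1}^{n-1} \Big[\, d_i^2 + 2 d_i (\varepsilon_{i+1} - \varepsilon_i) + (\varepsilon_{i+1} - \varepsilon_i)^2 \,\Big].$$
Taking expectations, the $d_i^2$ part contributes only $O(1/n^2)$, the cross term has mean zero, and $E\big[(\varepsilon_{i+1} - \varepsilon_i)^2\big] = \operatorname{var}(\varepsilon_{i+1}) + \operatorname{var}(\varepsilon_i)$, which gives the first limit when the $\varepsilon_i$ are identically distributed (as the stated limit $\operatorname{var}(\varepsilon_1)$ suggests). It is the variance of this sum that I cannot control.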


1 Answer


First, I think you need somewhat stronger assumptions to prove the required results, e.g. that the $\varepsilon_i$ are i.i.d. with a finite fourth moment.

Now, suppose $A_n$ and $B_n$ are two sequences of random variables with $\lim_{n\to\infty}\mathrm{var}(A_n) = 0$ and $\lim_{n\to\infty}\mathrm{var}(B_n) = 0$. Then $\lim_{n\to\infty}\mathrm{var}(A_n + B_n) = 0$ as well. Prove this! Then split $\hat\sigma_n$ into a sum of simpler sequences and show that the variance of each of them vanishes in the limit.
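A sketch of how the pieces might fit together (the shorthand $d_i$ and the odd/even split below are my own choices, and I am assuming i.i.d. errors with $E(\varepsilon_1^4) < \infty$ as above): by Cauchy–Schwarz, $|\operatorname{cov}(A_n, B_n)| \le \sqrt{\operatorname{var}(A_n)\operatorname{var}(B_n)}$, so
$$\operatorname{var}(A_n + B_n) \le \operatorname{var}(A_n) + \operatorname{var}(B_n) + 2\sqrt{\operatorname{var}(A_n)\operatorname{var}(B_n)} \longrightarrow 0.$$
Now write $d_i := f\!\left(\tfrac{i+1}{n}\right) - f\!\left(\tfrac{i}{n}\right)$, so $|d_i| \le K/n$, and split
$$\hat\sigma_n = \frac{1}{2(n-1)}\sum_{i=1}^{n-1} d_i^2 \;+\; \underbrace{\frac{1}{n-1}\sum_{i=1}^{n-1} d_i(\varepsilon_{i+1}-\varepsilon_i)}_{=:A_n} \;+\; \underbrace{\frac{1}{2(n-1)}\sum_{i=1}^{n-1} (\varepsilon_{i+1}-\varepsilon_i)^2}_{=:B_n}.$$
The first sum is deterministic, so it contributes nothing to the variance. In $A_n$, summands with $|i-j| \ge 2$ involve disjoint $\varepsilon$'s and are independent, so only adjacent summands are correlated; combined with $|d_i| \le K/n$, this forces $\operatorname{var}(A_n) \to 0$. For $B_n$, split the sum over even and odd $i$: within each half the summands $(\varepsilon_{i+1}-\varepsilon_i)^2$ involve disjoint $\varepsilon$'s and are independent, so each half has variance of order $1/n$ (this is where the fourth moment is used). The claim about $\operatorname{var}(A_n + B_n)$ then lets you put all the pieces back together.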

  • Thanks for your answer. In this case we are not given that the fourth moment is finite, so I think we have to prove the result without that assumption. Commented Nov 2 at 0:57
  • @Wei Here the number of terms in the sum is not fixed, so the convergence of $\mathrm{var}\!\big((Y_{i+1}-Y_i)^2/(2(n-1))\big)$ for each fixed $i$ is not sufficient. Commented Nov 2 at 10:04
