I have a time-varying signal $f(t)$. Two probes measure its value at two different times:
\begin{align} X_1 &= f(t = t_0) + e_1 \\ X_2 &= f(t = t_0+h) + e_2 \end{align}
Here $e_1$ and $e_2$ are independent random variables, each with zero mean and variance $\sigma^2$. The goal is to estimate the rate of change of the signal using the following estimator:
$$R = (X_2 - X_1)/h$$
How should I calculate $E(R)$ and $\text{Var}(R)$?
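For reference, here is a minimal Monte Carlo sketch I would use to check any closed-form answer numerically. The specific signal $f(t) = \sin t$ and the values of $t_0$, $h$, and $\sigma$ are arbitrary illustrative choices, not part of the problem statement.

```python
import numpy as np

# Arbitrary illustrative choices (not part of the original problem)
f = np.sin                      # example signal f(t)
t0, h, sigma = 1.0, 0.1, 0.05
n_trials = 1_000_000

rng = np.random.default_rng(0)

# Independent zero-mean noise with variance sigma^2 on each probe
e1 = rng.normal(0.0, sigma, n_trials)
e2 = rng.normal(0.0, sigma, n_trials)

X1 = f(t0) + e1
X2 = f(t0 + h) + e2

# Estimator R = (X2 - X1) / h
R = (X2 - X1) / h

print("empirical E(R)  :", R.mean())
print("empirical Var(R):", R.var())
print("(f(t0+h) - f(t0)) / h =", (f(t0 + h) - f(t0)) / h)  # noise-free slope, for comparison
```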