Asymptotic lower bounds in estimating jumps.

Authors
Publication date
2014
Publication type
Journal Article
Summary
We study the problem of efficient estimation of jumps for stochastic processes. We assume that the stochastic jump process $(X_t)_{t \in [0,1]}$ is observed discretely, with a sampling step of size $1/n$. In the spirit of Hájek's convolution theorem, we establish lower bounds for the estimation error of the sequence of jumps $(\Delta X_{T_k})_k$. As an intermediate result, we prove a LAMN property, with rate $\sqrt{n}$, when the marks of the underlying jump component are deterministic. We then deduce a convolution theorem, with an explicit asymptotic minimal variance, in the case where the marks of the jump component are random. To show that this lower bound is sharp, we prove that a threshold estimator of the sequence of jumps $(\Delta X_{T_k})_k$, based on the discrete observations, attains the minimal variance of the convolution theorem.
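The threshold idea referred to in the summary can be illustrated on simulated data. The Python sketch below is not the paper's estimator; it is a minimal illustration assuming a standard cutoff of the form $u_n = c\,(1/n)^{\varpi}$ with $0 < \varpi < 1/2$: increments of the discretely observed path whose absolute value exceeds $u_n$ are taken as estimates of the jumps $\Delta X_{T_k}$. The toy model (a Brownian path with two deterministic jumps), the constant $c$, and the exponent $\varpi$ are illustrative choices, not values from the article.

```python
# Illustrative sketch (not from the paper): threshold detection of jumps from
# discrete observations X_{i/n}, i = 0, ..., n, of a jump diffusion on [0, 1].
import numpy as np

rng = np.random.default_rng(0)

def simulate_jump_diffusion(n, sigma=1.0, jump_times=(0.3, 0.7), jump_sizes=(1.5, -2.0)):
    """Simulate a Brownian path with volatility sigma plus a few deterministic jumps."""
    dt = 1.0 / n
    t = np.arange(n + 1) * dt
    x = np.concatenate([[0.0], np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n))])
    for tk, dxk in zip(jump_times, jump_sizes):
        x[t >= tk] += dxk  # add the jump to the path from time T_k onward
    return t, x

def threshold_jump_estimates(x, n, c=4.0, varpi=0.49):
    """Return (interval index, increment) for increments exceeding u_n = c * n**(-varpi).

    Increments larger than the cutoff are attributed to jumps; the increment itself
    serves as the estimate of Delta X_{T_k}.
    """
    u_n = c * n ** (-varpi)
    incr = np.diff(x)
    idx = np.flatnonzero(np.abs(incr) > u_n)
    return [(int(i), float(incr[i])) for i in idx]

n = 5000
t, x = simulate_jump_diffusion(n)
print(threshold_jump_estimates(x, n))  # estimated jumps close to 1.5 and -2.0
```

With a sampling step $1/n$, the Brownian part of an increment is of order $n^{-1/2}$, so any cutoff $u_n = c\,n^{-\varpi}$ with $\varpi < 1/2$ eventually separates the diffusive increments from the finitely many jump increments, which is what makes the increment over the detected interval a natural estimator of the jump size.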
Publisher
Bernoulli Society for Mathematical Statistics and Probability