ABSTRACT

A modification of underlying data is called ill-posed if a large change in an estimand yields a relatively small change in the modified data, which in turn slows down the rate of risk convergence relative to estimation based on the underlying data. A classical example of an ill-posed modification is contamination of data by additive errors, traditionally referred to as measurement errors. In particular, normal measurement errors may slow the MISE convergence down to logarithmic rates. Further, in some cases measurement errors yield a destructive modification that makes consistent estimation impossible. As will be explained shortly, the problem of density estimation for data contaminated by measurement errors may also be referred to as a deconvolution problem, and there is a rich mathematical and statistical literature exploring deconvolution problems. Regression with predictors contaminated by additive errors is another classical example of an ill-posed modification. Ill-posedness may also occur if a directly observed sample does not correspond to the problem at hand.
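To make the deconvolution connection concrete, the following is a minimal sketch of the standard additive measurement-error model; the notation $X$, $\varepsilon$, $Y$ and the densities $f^{X}$, $f^{\varepsilon}$, $f^{Y}$ are illustrative and assume the error $\varepsilon$ is independent of the underlying observation $X$:
\[
Y = X + \varepsilon, \qquad
f^{Y}(y) \;=\; \int f^{X}(x)\, f^{\varepsilon}(y - x)\, dx \;=\; (f^{X} * f^{\varepsilon})(y).
\]
In the Fourier domain the convolution factorizes into the product of characteristic functions, $\phi^{Y}(t) = \phi^{X}(t)\,\phi^{\varepsilon}(t)$, so recovering $f^{X}$ from contaminated observations amounts to dividing by $\phi^{\varepsilon}(t)$. Under the assumption of a zero-mean normal error, $\phi^{\varepsilon}(t) = e^{-\sigma^{2}t^{2}/2}$ decays exponentially, which is what drives the logarithmic MISE rates mentioned above.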