Estimation of a regression function is a well-known problem in the context of errors in variables, where the explanatory variable is observed with random noise. This noise can be of two types, known as classical and Berkson, and it is common to assume that the error is purely of one of...
Persistent link: https://www.econbiz.de/10005203034
It is common, in errors-in-variables problems in regression, to assume that the errors are incurred 'after the experiment', in that the observed value of the explanatory variable is an independent perturbation of its true value. However, if the errors are incurred 'before the experiment' then...
Persistent link: https://www.econbiz.de/10005203036
In this note we show that, from a conventional viewpoint, there are particularly close parallels between optimal-kernel-choice problems in non-parametric deconvolution and their better-understood counterparts in density estimation and regression. However, other aspects of these problems are...
Persistent link: https://www.econbiz.de/10005254525
In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is...
Persistent link: https://www.econbiz.de/10010605415