Optimal global error measure approach to risk reduction in modern regression

Hong Pan, Purdue University

Abstract

We first review the concepts fundamental to statistical inference procedures based on nonparametric regression models. The global error properties of an estimator over its parameter space are used to define a general framework that places various existing optimality criteria and heuristics in a coherent and rigorous perspective. A class of Bayes-robust and asymptotically minimax estimators is then constructed by comprehensively accounting for the major aspects of their global error measures. The new estimator is shown to have better risk behavior than the usual least-squares and other Bayesian procedures, and to be robust to misspecification of the prior assumption on the parameters, among several other desirable properties. Moreover, the associated single-run algorithm delivers this improved risk performance without incurring extra computational cost. As a case study, the prediction performance of the new, widely applicable, and well-balanced estimation procedure is evaluated and compared critically on a class of generalized additive regression methods, namely the feedforward neural network model.
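To illustrate the kind of global risk comparison the abstract refers to (this is a generic Monte Carlo sketch, not the dissertation's estimator), the snippet below compares the empirical parameter-estimation risk of ordinary least squares with that of a simple ridge-type Bayes procedure whose zero-mean prior is deliberately misspecified. All settings (sample size, penalty, noise level) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (assumptions for this sketch, not values from the dissertation).
n, p = 50, 10          # sample size and number of regressors
sigma = 1.0            # noise standard deviation
lam = 5.0              # ridge penalty, acting as a (misspecified) prior precision
n_rep = 2000           # Monte Carlo replications

beta_true = rng.normal(loc=2.0, scale=1.0, size=p)  # true coefficients far from the zero prior mean
X = rng.normal(size=(n, p))

risk_ols = 0.0
risk_ridge = 0.0
for _ in range(n_rep):
    y = X @ beta_true + sigma * rng.normal(size=n)
    # Ordinary least squares: no shrinkage.
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    # Ridge estimate (posterior mean under a zero-mean Gaussian prior),
    # here misspecified relative to beta_true.
    b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    risk_ols += np.sum((b_ols - beta_true) ** 2)
    risk_ridge += np.sum((b_ridge - beta_true) ** 2)

print("Monte Carlo risk, OLS  :", risk_ols / n_rep)
print("Monte Carlo risk, ridge:", risk_ridge / n_rep)
```

Averaging the squared estimation error over replications approximates each estimator's risk at the chosen parameter value; repeating the comparison over a range of parameter values is the kind of global (whole-parameter-space) error assessment the abstract describes.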

Degree

Ph.D.

Advisors

Roychowdhury, Purdue University.

Subject Area

Computer science|Statistics
