Fourier methods for sufficient dimension reduction in regression

Peng Zeng, Purdue University

Abstract

Consider a univariate response Y and a p-dimensional vector X of continuous predictors. Sufficient dimension reduction aims to identify a subspace of the predictor space that captures all of the information X carries about Y. The central subspace and the central mean subspace are the minimal subspaces that retain the full information about the conditional distribution of Y given X and about the conditional mean E(Y | X), respectively. To estimate these subspaces, we propose two novel families of methods: Fourier methods for the central subspace (FC) and Fourier methods for the central mean subspace (FM). For each family, a candidate matrix is derived whose column space coincides with the targeted subspace. Explicit estimates of the candidate matrices are obtained when X is normal. When the density of X is unknown, kernel density estimation or local likelihood density estimation is used to estimate it, leading to alternative estimates of the candidate matrices. The asymptotic properties of these estimates are established. We describe the procedure for estimating the central subspace or the central mean subspace from a given sample and discuss the selection of tuning parameters in detail. When the dimension of the target subspace is unknown, either a bootstrap method or a sequence of tests can be used to determine it. We also discuss several hypotheses that are important in sufficient dimension reduction. In simulation studies, FC and FM compare favorably with existing methods, and FC is applied to a real data set to illustrate its use in practice.
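The sketch below illustrates, under stated assumptions, the general recipe the abstract describes for FM: form a candidate matrix whose column space equals the central mean subspace, then recover a basis from its leading eigenvectors. It assumes X is normal, standardizes the predictors, uses Stein's identity to write the Fourier transform of the regression gradient as E[(Z - iw) Y exp(i w'Z)], and approximates the weighted integral over frequencies w by Monte Carlo with a Gaussian weight. The function names, the tuning parameter sigma_w, and the Monte Carlo scheme are illustrative choices, not the dissertation's exact estimator.

```python
# Hypothetical sketch of an FM-type estimator under the normal-X assumption.
import numpy as np

def fm_candidate_matrix(X, Y, sigma_w=1.0, n_freq=500, rng=None):
    """Monte Carlo estimate of a candidate matrix for the central mean subspace."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean); requires a
    # nondegenerate sample covariance.
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Draw frequencies from the Gaussian weight K(w).
    W = rng.normal(scale=sigma_w, size=(n_freq, p))
    M = np.zeros((p, p))
    for w in W:
        phase = np.exp(1j * Z @ w)                          # exp(i w'Z_i)
        # Stein's identity under normality: psi(w) = E[(Z - i w) Y exp(i w'Z)]
        psi = ((Z - 1j * w) * (Y * phase)[:, None]).mean(axis=0)
        M += np.real(np.outer(psi, psi.conj()))
    return M / n_freq, Sigma_inv_sqrt

def estimate_cms_basis(X, Y, d, **kwargs):
    """Leading eigenvectors of the candidate matrix, mapped back to the X scale."""
    M, Sigma_inv_sqrt = fm_candidate_matrix(X, Y, **kwargs)
    evals, evecs = np.linalg.eigh(M)
    eta = evecs[:, np.argsort(evals)[::-1][:d]]             # top-d directions for Z
    return Sigma_inv_sqrt @ eta                             # basis on the original X scale
```

In this sketch the dimension d is taken as known; as the abstract notes, a bootstrap method or a sequence of tests would be used to determine it in practice, and different density estimates of X would replace the normality assumption when it is not tenable.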

Degree

Ph.D.

Advisors

Zhu, Purdue University.

Subject Area

Statistics
