Gradient-Based Kernel Dimension Reduction for Regression
Publication Type:
Article
Authors:
Fukumizu, Kenji; Leng, Chenlei
Affiliations:
Research Organization of Information & Systems (ROIS); Institute of Statistical Mathematics (ISM) - Japan; University of Warwick; National University of Singapore
Journal:
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
ISSN/ISBN:
0162-1459
DOI:
10.1080/01621459.2013.838167
Publication Date:
2014
Pages:
359-370
Keywords:
sliced inverse regression
Abstract:
This article proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive-definite kernels, or reproducing kernel Hilbert spaces (RKHSs). The purpose of the dimension reduction is to find directions in the explanatory variables that sufficiently explain the response; this is called sufficient dimension reduction. The proposed method is based on an estimator of the gradient of the regression function, considered for feature vectors mapped into RKHSs. It is proved that the method estimates directions that achieve sufficient dimension reduction. In comparison with other existing methods, the proposed one is widely applicable without strong assumptions on the distributions or the types of variables, and requires only an eigendecomposition to estimate the projection matrix. The theoretical analysis shows that the estimator is consistent at a certain rate under suitable conditions. Experimental results demonstrate that the proposed method finds effective directions with efficient computation, even for high-dimensional explanatory variables.
Source URL:
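
The abstract describes an estimator built from kernel-based gradients of the regression function, with the projection matrix obtained by an eigendecomposition. Below is a minimal sketch of that idea in Python, assuming Gaussian kernels on both X and Y and a fixed regularization constant; the function name gkdr, the bandwidth parameters, and their default values are illustrative choices for this sketch, not the authors' implementation or tuned settings.

```python
import numpy as np

def gkdr(X, Y, d, sigma_x=1.0, sigma_y=1.0, eps=1e-3):
    """Sketch of gradient-based kernel dimension reduction.

    Returns a p x d matrix whose columns are the top-d eigenvectors
    of an averaged gradient outer-product matrix, estimating a basis
    of the sufficient-dimension-reduction subspace.
    """
    n, p = X.shape
    Y = Y.reshape(n, -1)

    # Gaussian Gram matrix for a sample Z with bandwidth sigma.
    def gram(Z, sigma):
        sq = np.sum(Z**2, axis=1)
        D = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
        return np.exp(-D / (2.0 * sigma**2))

    Gx = gram(X, sigma_x)
    Gy = gram(Y, sigma_y)

    # Regularized inverse (G_X + n*eps*I)^{-1}, shared by all points.
    Ginv = np.linalg.solve(Gx + n * eps * np.eye(n), np.eye(n))
    F = Ginv @ Gy @ Ginv  # n x n core matrix

    # M = (1/n) sum_i dK_i^T F dK_i, where dK_i is the n x p matrix of
    # kernel gradients d k(X_j, x) / dx evaluated at x = X_i.
    M = np.zeros((p, p))
    for i in range(n):
        # Gaussian kernel gradient: (X_j - X_i) / sigma_x^2 * k(X_i, X_j)
        dK = (X - X[i]) / sigma_x**2 * Gx[:, [i]]  # n x p
        M += dK.T @ F @ dK
    M /= n

    # M is symmetric, so eigh applies; take the top-d eigenvectors.
    eigvals, eigvecs = np.linalg.eigh(M)
    return eigvecs[:, ::-1][:, :d]

# Toy usage: Y depends on X only through its first two coordinates,
# so the returned columns should roughly span that subspace.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
Y = np.sin(X[:, 0]) + X[:, 1]**2 + 0.1 * rng.standard_normal(200)
B = gkdr(X, Y, d=2)
```

In practice the kernel bandwidths and the regularization constant would be chosen by cross-validation rather than fixed as above; the only heavy steps are one n x n linear solve and a p x p eigendecomposition, consistent with the computational efficiency claimed in the abstract.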